Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 195 message thread spans 7 pages: < < 195 ( 1 2 3 4 [5] 6 7 > >     
Many Weeks since the Panda Update - Any Improvements? [part 3]
tedster




msg:4293484
 12:18 am on Apr 7, 2011 (gmt 0)

< continued from [webmasterworld.com...] >

I'm not at all convinced that it's any kind of penalty

After some discussion, I think I should explain this a little more. There is one way we might say Panda is APPLIED like a penalty. You basically have two kinds of affected pages - the primary pages that Panda assessed as low quality, and then the rest of the site, which received some kind of site-wide demotion.

IDEA ONE
The site-wide demotion is applied like a penalty in that a negative factor is consistently applied to rankings across a lot of pages. However, I'm not assuming that rankings will return after a set "time out" period in the penalty box. If Google feels they have identified pages that give their users a poor experience, then they would not let those pages rank again just because a certain amount of time has passed.

IDEA TWO
The site-wide demotion seems to flow backwards through the site's internal linking. This I'm still not totally certain of, but there does seem to be a pattern that says "the negative site-wide factor is strongest for pages that are just one click away from the really bad page and not as strong for pages that are more distant."

Does "idea two" line up with what others see on affected sites?
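"Idea two" - a demotion that weakens with click distance from the flagged page - could be sketched roughly like this. This is purely a hypothetical model to make the pattern concrete; the site graph, base penalty, and decay rate are invented numbers, not anything Google has confirmed.

```javascript
// Hypothetical model of "idea two": a site-wide demotion that decays
// with click distance from the pages Panda flagged as low quality.
// basePenalty and decay are made-up illustrative values.
function demotionByDistance(links, flagged, basePenalty = 0.5, decay = 0.5) {
  // BFS from the flagged pages to find each page's click distance.
  const dist = new Map(flagged.map(p => [p, 0]));
  const queue = [...flagged];
  while (queue.length > 0) {
    const page = queue.shift();
    for (const next of links[page] || []) {
      if (!dist.has(next)) {
        dist.set(next, dist.get(page) + 1);
        queue.push(next);
      }
    }
  }
  // Penalty shrinks geometrically with distance from the bad page.
  const penalties = {};
  for (const [page, d] of dist) {
    penalties[page] = basePenalty * Math.pow(decay, d);
  }
  return penalties;
}

// "bad" links to "hub", which links to two deeper pages.
const site = { bad: ["hub"], hub: ["deep1", "deep2"], deep1: [], deep2: [] };
const p = demotionByDistance(site, ["bad"]);
// p.bad = 0.5, p.hub = 0.25, p.deep1 = p.deep2 = 0.125
```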

[edited by: tedster at 3:01 am (utc) on Apr 8, 2011]

 

Dan01




msg:4297373
 5:05 am on Apr 14, 2011 (gmt 0)

Outland88:

As we bantered back and forth in email, he shocked me a few times by stating that many areas were going to be virtually locked out to competition, whatever that meant to him at the time.


I think that is their goal. Let me explain that. I think they all want to become "the" Internet. Yahoo had people stuck on their site for years. Same with AOL. Both Google and Bing do it. For instance, I have seen Bing open Wikipedia in an iframe. Google is kind of doing that with image search now. Facebook would love to be "the" Internet. Why go anywhere else? Google has their own shopping now, so why go to Pricewatch? It is the natural progression for these companies.

bramley




msg:4297374
 5:06 am on Apr 14, 2011 (gmt 0)

When one logs out, personalised SERPs etc. stop. But does the monitoring stop, in the case of the toolbar?

bramley




msg:4297380
 5:14 am on Apr 14, 2011 (gmt 0)

Its sensors are not touch pads and camera eyes, just human behaviour

bramley




msg:4297392
 5:28 am on Apr 14, 2011 (gmt 0)

If ads play a part, it is surely the percentage of the content they occupy and how "in your face" they are that causes demotion; they can be fine if done reasonably.

zerillos




msg:4297414
 7:09 am on Apr 14, 2011 (gmt 0)

Hey, what's your opinion about this? Many people have taken down lots of pages. Let's say you took out pages a(i), where i runs from 0 to 5000, and page b. Pages a(i) all link to page b.

What if Googlebot spiders page b before pages a(i), and gets a 404? Googlebot knows pages a(i) link to b, which means a(i) now have broken links on them. That has to lower the overall "quality" factor of the website.

Could this temporarily lower your rankings - until Googlebot crawls all the pages and sees that the a(i) are also 404, and that in fact there are no broken links?

In my opinion this could be a reason why people don't see any improvement so far...
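As a thought experiment, the scenario above can be simulated. The function and page names below are invented for illustration only; the point is that stale copies of the a(i) pages could make the site look full of broken links until they are recrawled.

```javascript
// Sketch of the scenario: pages a0..a4999 all link to page b, and
// everything is removed at once. If the crawler sees b's 404 before it
// recrawls the a(i) pages, its last-known copies of a(i) still contain
// links to b, so the site briefly appears full of broken links.
function apparentBrokenLinks(index, recrawled, removed) {
  let broken = 0;
  for (const [page, links] of Object.entries(index)) {
    if (recrawled.has(page)) continue; // crawler already knows this page is gone
    for (const target of links) {
      if (removed.has(target)) broken++; // stale link pointing at a 404'd page
    }
  }
  return broken;
}

// Build the last-crawled copy of the site: 5000 a(i) pages linking to b.
const index = {};
for (let i = 0; i < 5000; i++) index[`a${i}`] = ["b"];
index.b = [];

const removed = new Set(Object.keys(index)); // everything now returns 404

// Only b has been recrawled so far: the site looks like it has 5000 broken links.
console.log(apparentBrokenLinks(index, new Set(["b"]), removed)); // 5000
// Once the a(i) pages are recrawled too, the count drops back to 0.
console.log(apparentBrokenLinks(index, removed, removed)); // 0
```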

rlange




msg:4297552
 1:46 pm on Apr 14, 2011 (gmt 0)

zerillos wrote:
could this temporarily actually lower your rankings?

I doubt it, and for the same reason that Google keeps checking 404'd pages: it could have been a mistake. I'm sure that particular crawl error would have to exist for a certain period of time before the algorithm starts penalizing for it.

--
Ryan

networkliquidators




msg:4297609
 3:34 pm on Apr 14, 2011 (gmt 0)

Here is some evidence that their roll-out is broken. I hate to use specifics, but you need to see this. This domain I put up has been pulled down for 8 months (meaning a dead domain with no content).

The domain is juniper.us. Yet, it ranks #1 for this query. [google.com...]

I'm not promoting anything here, just saying their algo cannot be working as intended. How can we trust Google's quality results when a domain with no content has been #1 in their index for 8 months?

mike2010




msg:4297630
 4:05 pm on Apr 14, 2011 (gmt 0)

I'm still at like #18, where I used to be #3 forever.

No changes still...

Jane_Doe




msg:4297634
 4:10 pm on Apr 14, 2011 (gmt 0)

I didn't put much stock in what was said at the time and still don't, but Google could be delving into how many web sites are being run by an entity.


I don't think it is the number of web sites per se. Just as low quality pages can bring down a whole site, I suspect low quality sites can drag down others with the same owner.

johnhh




msg:4297955
 10:39 pm on Apr 14, 2011 (gmt 0)

No changes here in the UK - still 30% down. I've retaliated - dropped all AdSense and paused all AdWords.

Running A/B tests on certain pages - our Pandalisation (or Pandalization if in the USA) is not site-wide, just certain pages.

Luckily most of our traffic comes from non-Google sources.

dickbaker




msg:4298000
 11:55 pm on Apr 14, 2011 (gmt 0)

I've been reading a lot of comments suggesting that drop-down menus may be a contributing factor in this. Since every page on my site has a drop down menu, and since I think it helps the visitor to have the menu, I was considering changing the links.

Right now the links in the menu are standard <a href="SomePage.html">Some Page</a>

If the links are in javascript, they would look like this: <a href="javascript:;" onclick="goToURL('parent','http://www.example.com/SomePage.html'); return document.returnValue">Some Page</a>

I'd hate to lose the menus because of the user experience, but I need users first in order for them to have an experience.

Do you think Google would treat the javascript links the same as standard links when it comes to demotion?

ascensions




msg:4298007
 12:12 am on Apr 15, 2011 (gmt 0)

Google's new motto for Panda was stolen from Eat Pray Love:

Ruin is a gift. Ruin is the road to transformation.

c41lum




msg:4298013
 12:38 am on Apr 15, 2011 (gmt 0)

How drastic has anybody got? I'm thinking of stripping my site back to just a few pages from 30,000. My staff and I have spent 8 years creating the content, but if it helps our top 3 pages recover, that's what we will have to do.

walkman




msg:4298015
 12:45 am on Apr 15, 2011 (gmt 0)

How drastic has anybody got? I'm thinking of stripping my site back to just a few pages from 30,000. My staff and I have spent 8 years creating the content, but if it helps our top 3 pages recover, that's what we will have to do.


I would do that too, but no one has come back so far despite many changes. Google is telling us nothing, and what they have said so far about Panda has turned out to be lies. So you may remove your 30,000 pages and regret it when your competitors come back.

I am absolutely, positively sure that all my pages have plenty of unique and useful content, especially this last month. But Google doesn't care.

c41lum




msg:4298043
 1:43 am on Apr 15, 2011 (gmt 0)

Hi Walkman, I'm seriously considering it.

Has anybody on here tried that approach yet?

walkman




msg:4298060
 2:28 am on Apr 15, 2011 (gmt 0)


Hi Walkman, I'm seriously considering it.

Has anybody on here tried that approach yet?


Plenty - read about it on the Google support forums too. No one has come back yet and Google isn't saying anything. Hang tight for a while; SERPs have been shifting all day today.

ErnestHemingway




msg:4298066
 3:02 am on Apr 15, 2011 (gmt 0)

My guess would be that the best way is to actually test everything out and see if there are any changes. What is interesting to note is that a lot of the sites that lost ranking were sites ranking #2, #3, #4, etc.

If you dig a little further you will see a lot of these sites were last indexed April 9th - there has been no fresh update.

I have had sites that used to get indexed on pretty much a daily basis, or every 2 days. But since the update, the sites that lost position have not been indexed.

I have already made tons and tons of changes on some of them and will share data once they get indexed, to see if I get any boost in rankings.

dickbaker




msg:4298077
 4:00 am on Apr 15, 2011 (gmt 0)

Sorry to be a pest about this question, but does anyone have any idea if the javascript link I described seven posts back would be counted as interlinking on the site, or if Google would not treat the javascript links as regular links?

I never gave it any thought nor used javascript links, so I don't know how Google treats such links.

walkman




msg:4298089
 5:19 am on Apr 15, 2011 (gmt 0)



If you dig a little further you will see a lot of these sites were last indexed April 9th - there has been no fresh update.

I have had sites that used to get indexed on pretty much a daily basis, or every 2 days. But since the update, the sites that lost position have not been indexed.


I still get the fresh tag on 4 of my pages, and 2 new pages I added today were in Google's cache within an hour.

Google grabs about 50% of my pages every day but somehow they don't show up with new cache dates.

semseoanalyst




msg:4298093
 5:37 am on Apr 15, 2011 (gmt 0)

Google's Q1 revenue results came out:
[investor.google.com...]
Not sure if anyone has shared this before - just sharing. If it needs another thread, Ted, please process that.

tedster




msg:4298101
 6:07 am on Apr 15, 2011 (gmt 0)

@dickbaker, I can share the results of some testing I did almost two years ago. If the full URL is directly in the JavaScript code (as in your example), Google will read that and treat it as a link. That is, they will establish a "virtual link" that stands in for the JavaScript in their web graph.

But if the JavaScript calls a function defined externally - one that generates the URL from an array or something like that - then Google has more of a problem and it's likely not to be scored as a link.

JavaScript parsing is an area where Google is evolving rather rapidly - so this may have moved further since my tests. But according to more recent tests that deadsea reported on here, they still don't evaluate those functions that call external scripts.
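To make the distinction above concrete, here is a crude sketch of why a literal URL is easy to pick up while an indirectly generated one is not. The regex "crawler" is only an illustration - Google's actual JavaScript handling is far more sophisticated - and goToURL/goToPage are just the hypothetical handlers from the earlier post.

```javascript
// A link with the full URL embedded in the onclick handler, versus one
// where the URL is assembled inside an external function from an id.
const literalLink =
  `<a href="javascript:;" onclick="goToURL('parent','http://www.example.com/SomePage.html')">Some Page</a>`;
const indirectLink =
  `<a href="javascript:;" onclick="goToPage(42)">Some Page</a>`; // URL built elsewhere

// Simple pattern matching finds literal URLs; it cannot recover a URL
// that only exists after executing an externally defined function.
function findLiteralUrls(html) {
  return html.match(/https?:\/\/[^\s'"<>]+/g) || [];
}

console.log(findLiteralUrls(literalLink));  // ["http://www.example.com/SomePage.html"]
console.log(findLiteralUrls(indirectLink)); // []
```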

zoltan




msg:4298123
 8:00 am on Apr 15, 2011 (gmt 0)

I wonder if too many pictures can hurt your rankings. I have many sites, and the one that was hit the most has the following pattern on its "country widget page" - 10 links to different widgets, each laid out as:

Big picture (180x180) | Widget title (+ link to detailed widget page) | Small description of widget | Up to 8 small pictures (60x60)

If a visitor clicks on one of the small pictures, the big 180x180 picture changes to the bigger version of the small picture clicked (Ajax).

So on one page I can have up to 90 pictures. These pages were hit the most... Can anyone see a similar pattern?

zoltan




msg:4298139
 8:52 am on Apr 15, 2011 (gmt 0)

And another pattern I see on the many websites we manage: 80% of my sites were not affected at all (apart from the minor +/- 3-5% daily fluctuations).
On the other 20% - the affected sites - I see anywhere between a 10% and 50% loss of Google traffic.
And the traffic loss from Google is quite constant per site, usually 10, 20, 30, 40 or 50%. It does not fluctuate per site: if a site got 15k visitors/day from Google and lost 30%, the traffic is always close to 10.5k (10.2, 10.6, 10.7, 10.3 etc.). So it definitely looks like a sitewide penalty...

[edited by: zoltan at 9:45 am (utc) on Apr 15, 2011]

Shatner




msg:4298156
 9:38 am on Apr 15, 2011 (gmt 0)

@zoltan it's definitely sitewide. I lost 50%, and it hasn't fluctuated... until Panda 2.0 this week, when I lost another 25%. And that has stayed steady too. Very little fluctuation overall, even though if I look at individual keywords I sometimes see fluctuation.

Panda, whether Google admits it or not, is definitely applied site wide. That's really the only thing that anyone is certain about.

dickbaker




msg:4298301
 3:32 pm on Apr 15, 2011 (gmt 0)

Thanks for the reply, Tedster. The drop down menu is better for visitors than making them go to one central page every time they want to get a link to another page on the site. User experience should trump Google's perception of "quality".

indyank




msg:4298326
 4:06 pm on Apr 15, 2011 (gmt 0)

But isn't it better to move the javascript to a separate file and block it in robots.txt? Will Google still try to parse it?
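If that approach were taken, the robots.txt entry might look something like this (the /js/ path is hypothetical) - though whether blocking the file actually stops link discovery is exactly the open question here:

```
# Keep navigation scripts in their own directory and disallow crawling of it.
User-agent: *
Disallow: /js/
```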

robert76




msg:4298364
 5:31 pm on Apr 15, 2011 (gmt 0)

Concerning Panda being applied site-wide: why would I see some keywords continue to rank on the first page while others dropped many, many pages? Was the net sum meant to drop by 50%?

walkman




msg:4298461
 7:54 pm on Apr 15, 2011 (gmt 0)

Bing today has sent a few more visitors than Google - a first. And just yesterday Google barely beat Yahoo and Bing combined, which in itself meant a major, major drop.

So yesterday I lost traffic again.

tedster




msg:4298462
 7:56 pm on Apr 15, 2011 (gmt 0)

Panda, whether Google admits it or not, is definitely applied site wide.

Concerning Panda being applied site-wide: why would I see some keywords continue to rank on the first page while others dropped many, many pages?

Here's the way Google engineers have explained it, as I understand them.

First, low quality pages are flagged and tagged to rank lower. Then a site-wide factor is calculated from those "poor page" scores, and that factor modifies the rankings of the site's other pages.

As a final step (especially with the recent second roll-out), some strong pages are located and boosted - possibly given an exception from the site-wide factor? This happens even on sites that received a strong negative ranking overall.
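As a rough illustration only, the three steps described above might look like this in code. The thresholds, the size of the site-wide factor, and the page data are all invented for the sketch - nothing here is Google's actual algorithm.

```javascript
// Sketch of the three reported steps: flag low quality pages, derive a
// site-wide factor from them, demote everything except "strong" pages.
// lowThreshold, highThreshold, and the 0.5 weight are invented values.
function pandaAdjust(pages, lowThreshold = 0.3, highThreshold = 0.9) {
  // Step 1: flag low quality pages.
  const flagged = pages.filter(p => p.quality < lowThreshold);
  // Step 2: derive a site-wide factor from the share of flagged pages.
  const siteFactor = 1 - 0.5 * (flagged.length / pages.length);
  // Step 3: strong pages are exempted; everything else gets the demotion.
  return pages.map(p => ({
    url: p.url,
    rankScore: p.quality >= highThreshold
      ? p.baseScore               // strong page: exempt from the demotion
      : p.baseScore * siteFactor, // everyone else: site-wide demotion
  }));
}

const pages = [
  { url: "/thin", quality: 0.1,  baseScore: 10 },
  { url: "/ok",   quality: 0.6,  baseScore: 10 },
  { url: "/star", quality: 0.95, baseScore: 10 },
];
// "/star" keeps its full score; "/thin" and "/ok" are demoted equally.
console.log(pandaAdjust(pages));
```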

robert76




msg:4298512
 9:06 pm on Apr 15, 2011 (gmt 0)

Something big is brewing concerning IP addresses:
http://www.ysmallbizstatus.com/status/archives/3965

All Yahoo stores are on the same IP. Many of their larger ones were hit, including some that made the Sistrix list.

Shatner




msg:4298523
 9:18 pm on Apr 15, 2011 (gmt 0)

This should probably be treated as rumor...

But I spoke to a friend who managed to get a couple of Google engineers on the phone, and they told him in no uncertain terms that he should not expect to see his site - or any site - come back for 5 or 6 months at the soonest, and that this is the way it is for the long haul.



All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved