Forum Moderators: open
that'd be cool
I agree completely.
But I fear that Google is not able [and never will be] to tell whether a fragment of JavaScript amounts to a 'window.open'.
I'm not talking about clear, linear JavaScript.
I'm talking about fragmented and/or encrypted JavaScript.
If I encrypt a JavaScript, every time in a different way [not difficult, with some knowledge of a scripting language (-> Python, Perl, etc.)], I can push the 'decrypting' work onto the browser, and the user will not be affected.
But think of the work for the 'JavaScript virtual machine' of Googlebot & his friends..
A lot of work.. too much.
This is the reason why Google is able, in some way, to track spam, but not to track 'bad JavaScript'.
A little humorous note: GG has often said that Google doesn't penalize pop-ups.. hehe.. the truth is that G can't penalize pop-ups.
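A minimal sketch of the kind of scheme described above. Everything here is my own illustration (the XOR 'crypt', the key, the URL), not anything a real site or Google actually uses:

```javascript
// Hypothetical per-page obfuscation: the server picks a fresh key for
// every page view, so the literal string "window.open" never appears
// in the delivered source. XOR is just the simplest possible 'crypt'.
function xorCrypt(text, key) {
  var out = '';
  for (var i = 0; i < text.length; i++) {
    out += String.fromCharCode(text.charCodeAt(i) ^ key);
  }
  return out; // XOR is its own inverse, so this both encodes and decodes
}

var key = 42; // the server would randomize this on every request
var hidden = xorCrypt('window.open("http://example.com/ad.html")', key);

// The page would ship only `hidden`, the key, and the tiny decoder
// above, then run: eval(xorCrypt(hidden, key));
// Decoding is trivial for one browser; a crawler would have to execute
// every such script on every page just to see the window.open call.
console.log(xorCrypt(hidden, key));
```

Since the key changes per request, even byte-for-byte comparison of crawled pages tells the spider nothing.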
cminblues
[edited by: cminblues at 4:51 pm (utc) on Oct. 13, 2002]
If we're voting, put me down for pop-up=lower PR
Great idea heretic! (but I doubt anyone @ google cares what we think)
zeus
the truth is that G can't penalize pop-ups
Right. That is computationally unfeasible. As I have said before, Google is a search engine that munches spidered pages and spits out the degree of correlation between a search string and an entry in its database. Nowhere in that definition does it get to discriminate between uses of technology.
The battle against popups has to be fought on the client front. Mozilla users can install a preferences toolbar and switch popups on/off; other browsers have to be pushed to provide similar functionality in the main UI.
the truth is that G can't penalize pop-ups
Right. That is computationally unfeasible.
Disagreeing with Count D (Duckula) here. However, I'm not saying that sites with popups are penalized at all; I'm just saying that they (probably) can detect sites that use popups.
What makes it 'computationally unfeasible'?
I can think of two possibilities.
1) If a browser, written by programmers, can do it, why can't the programmers at Google do it? Granted, if the JS is in a section disallowed by robots.txt, and Google is obeying it, Google would not be able to see the JS. Probably not an obstacle if they really wanted to check for popups.
2) 404 errors from their cache view. You've seen it. On any of their SERPs, find a page known to have popups. Take a look at the cached page. See this link: [webmasterworld.com...]
Therefore, examining the logs for 404 errors with the referer set to the cache could indicate possible popups, because the base href is not set correctly. (Again, this doesn't prove there's a popup, but if the 404 error was for a non-graphic file [one that supposedly returns text/html], there's a good indication that the site is using a popup or a meta-refresh-type redirect.)
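To make the log-mining idea concrete, here is a toy check. The log format, field layout, and URLs are invented for illustration; Google's real logs would look different:

```javascript
// Hypothetical access-log lines as Google's side might see them: a
// relative popup URL fetched from a cache view 404s against google.com.
var logLines = [
  '1.2.3.4 "GET /popuppage.html HTTP/1.0" 404 "http://www.google.com/search?q=cache:example.com/"',
  '5.6.7.8 "GET /images/logo.gif HTTP/1.0" 200 "http://www.example.com/"'
];

// Flag 404s whose referer is a cache view and that are not for images:
// exactly the signature a relative-URL popup (or meta refresh) leaves.
function suspectedPopupHits(lines) {
  return lines.filter(function (line) {
    return line.indexOf(' 404 ') !== -1 &&
           line.indexOf('q=cache:') !== -1 &&
           !/\.(gif|jpe?g|png)\b/.test(line);
  });
}

console.log(suspectedPopupHits(logLines).length); // 1
```

As the post says, this flags a *possible* popup only; it proves nothing by itself, and needs no JavaScript parsing at all.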
Personally, I don't think popup sites are penalized because I see too many of them. But I wouldn't sell google short on the ability to detect them.
The use of pop-overs on a site is quite a good feature when the information delivered is relevant to the site. For example... opening thumbnail photos into a larger image placed in a pop-over... or describing for the viewer the difference between a webmaster and a web designer, where each topic is just a couple of sentences delivered via a pop-over.
What makes it 'computationally unfeasible'
Pay attention to cminblues's comment, and search for others that have been posted previously.
If I encrypt a JavaScript, every time in a different way [not difficult, with some knowledge of a scripting language (-> Python, Perl, etc.)], I can push the 'decrypting' work onto the browser, and the user will not be affected.
But think of the work for the 'JavaScript virtual machine' of Googlebot & his friends..
A lot of work.. too much.
Let's say a web coder writes code that pops up windows with an apparently innocent function called simply 'foo()'. There is not enough information in the name itself to determine that this is a killer popup generator.
Let's say the popup-detection code goes further and tries to determine whether the code of the function itself raises a window. Then you would need to implement a parser for the JavaScript function, which is expensive and will not pay off in the long term.
Being extremely optimistic and getting to this point without expending too many resources, there is still the problem of finding out whether a function can open a window without explicitly asking to do it: that is the encryption part.
Make a metafunction that creates a function that asks to open a window.
Or create a meta-metafunction that creates a metafunction that creates a function that asks to open a window.
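A toy version of that idea (the function names, string fragments, and URL are my own invention): nothing in the source spells out window.open, yet one level of indirection rebuilds the call at run time.

```javascript
// A 'metafunction': assembles the popup call from harmless-looking
// string fragments and compiles it only at run time.
function makeOpener(parts) {
  var call = parts.join(''); // becomes: window.open("/ad.html")
  return new Function(call); // compiled here, but not yet executed
}

// A 'meta-metafunction': one level further removed from the call site.
function makeMaker() {
  return makeOpener(['win', 'dow.op', 'en("/ad.html")']);
}

var opener = makeMaker();
// In a browser, calling opener() would raise the popup. A static
// scanner sees only 'win', 'dow.op', 'en(...)'; each extra level of
// indirection multiplies what a crawler must simulate to catch it.
```

The fragments and the nesting depth can be varied mechanically, which is why a parser-only approach keeps falling behind.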
That happens on the client side, which may be a Pentium or better, so the client-side expense is negligible.
But implementing it on Google's side, having to parse every received page to check every level of encoding, is expensive. Those cycles could be put to better use, and maybe any use is better than chasing red herrings in every piece of JavaScript. That expedition would not give noticeable results in hunting weaponized popups, would give absolutely no advantage in determining the relevance of a search, and would burn coders and CPU, all because clients don't have a quick and easy way to turn off JavaScript for the 95% of the time they don't need it.
Pay attention to cminblues's comment, and search for others that have been posted previously.
Please pay attention to what I said:
1) If a browser, written by programmers, can do it, why can't the programmers at Google do it? Granted, if the JS is in a section disallowed by robots.txt, and Google is obeying it, Google would not be able to see the JS. Probably not an obstacle if they really wanted to check for popups.
No, I don't think Google would look at or try to decode the source code (now, or yet). All I'm saying is that if the code is available, and any old browser can run it, why can't Google? Agreed, it would be expensive, but they have over 10,000 computers doing the job. The only reason I posted that was to make the point that it is possible. You don't think they have coders at Google writing parsers for JS? Bottom line: don't underestimate them.
Anyway, my main point was about the 404 errors that some sites with popups will produce when a cache view is requested from Google. That raises a red flag immediately. No programming required.
Yes, I know that Google has the resources, but I think those resources are more valuable elsewhere, say, in returning better searches. I don't underestimate them; I just think that they're a search engine, and determining the use of a technology falls outside their reach, from my point of view.
Your point is appreciated; but if they start penalizing sites with popups, it won't take long before the first weapons-grade undetectable popup functions start populating the web; call that some kind of immune reaction. 'Futile' is the word that comes to mind. Then again, I'm naturally pessimistic...
I didn't mean to be offensive; sorry if it seemed that way.
Didn't take it that way at all.
If you read my posts, I'm saying that they're not putting any resources into it (and again, that's my guess; I'm not working there). I just wanted to state that they could if they wanted to, so don't underestimate them.
Google does respect robots.txt, so far, at least as far as my sites are concerned. So placing .js files in an unspiderable directory could prevent the files from being parsed. 404 errors, on the other hand, are created by the browser, based on the source of the cached files on Google. They don't like the extra bandwidth.
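For illustration, an entry like this in robots.txt (the directory name is hypothetical) would keep an obedient spider away from the script files entirely:

```
User-agent: Googlebot
Disallow: /scripts/
```

Of course, this only hides the .js files from a spider that chooses to obey robots.txt; it does nothing against the 404-in-the-cache signal, which comes from real visitors' browsers.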
I'm in agreement with you. I don't think they penalize sites with popups. But they can penalize anything they want to. There could be a popup penalty in the future...Ya never know!
[And I don't think the Google coders are unable to write a super-efficient 'JScript virtual machine'; I think using one would be too expensive for their computers.]
But, about 404 error codes: how would Google know the domain/URL of a site producing a, say, "popuppage.html" 404 error in Google's logs?
Because of the referer, you say.
So, if G. wants to be 'alerted' about a popup-generator page, at least someone must view the cached page.
If nobody views it, there's no alert.
But this [somebody viewing the cached page and requesting a non-existent page on Google] is still not enough.
Mr. Pop-up can write his code in at least two '404-escaping' ways:
1] In a 'crypted' way [see my post and Duckula's post], he adds a 'document.url/referer' check to the JScript.
This makes the JScript not open any window if the URL in the user's browser location bar contains the magic string 'q=cache'.
2] The crypted JScript opens the window with an absolute URL.
So, if someone views the cached page, the pop-up opens the right way, and nothing reaches Google's error log.
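Once decrypted, both escapes amount to only a few lines. The helper names and URL below are my own sketch, not code from any real site:

```javascript
// Sketch of the two '404-escaping' tricks: stay quiet in a cache view,
// and use an absolute URL so nothing ever 404s against google.com.
function isCacheView(href) {
  return href.indexOf('q=cache') !== -1; // the 'magic string'
}

function maybePopup(href) {
  if (isCacheView(href)) {
    return false; // viewer is on Google's cache: open nothing, log nothing
  }
  // in a real page: window.open('http://www.example.com/popuppage.html')
  // absolute URL, so even a missed check never hits Google's error log
  return true;
}

console.log(maybePopup('http://www.google.com/search?q=cache:example.com')); // false
console.log(maybePopup('http://www.example.com/page.html')); // true
```

Either trick alone defeats the 404-mining signal; combined with the per-page encryption, neither the source nor the logs give Google anything to key on.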
That said, I think the Google coders are superbly talented hackers.
But I also think they are much too smart to start a war that is lost from the beginning ;)
cminblues
<edited>referer -> document.url correction</edited>
Answer this: a person has the Google Toolbar installed (this is an IE feature). A popup is initiated, and the request is also sent to the toolbar as a notification. (I don't think the toolbar has to be open on the popup for it to be notified.) Google phones home. I might be making a wrong assumption about the toolbar notification, but remember, Google can update itself automatically in the future.
(No, I'm not waiting for the black helicopters to come swarming down, this is just another suggestion).
A popup is initiated and the request is also sent to the toolbar as a notification. Google phones home.
I agree with you (but I don't have an IE to test with right now).
So, a possible pattern for pop-up recognition might be:
some googlebar-advanced-feature user requesting at least 2 pages in a short time [3 or more in the worst pop-up cases heh..].
But think of a little timeout in the opening of the pop-ups [a realistic scenario, if we're talking about 'spamming/boring' pop-ups]:
What is the difference, in the eyes of Google and the googlebar, between this [multiple pop-ups] and a user quickly clicking around a site?
Nice example, anyway.
And proof that the Googlebar may be used in a lot of ways.. :)
cminblues
Over time, badly used popups would cause a page to be less competitive in Google than it could have been, by hindering the growth of its link popularity.
1. IMO of course, there is no such thing as a "clean" popunder ;)
2. This is debatable of course, but returning searches with a penalized PR for popovers and unders makes the search "better", IMO.
3. G does not catch 100% of the spam, but it catches enough to be darn good. There must be as many ways to hide spam as there are to hide popups, and if G finds popups as effectively as it finds spam, life will be very good.
Just by going after the standard way of opening up popups they will be doing a tremendous service. If these sites get wind of it, sure, they will come up with a way to get around it. Once that method is discovered by enough sites to be noticed by google, it will be easy to add that method to google's algorithm. Just like regular spam sites, this is a game that would be played back and forth. But all of us here have at least one site. Would you risk losing pagerank by putting a "G-safe" JS popup if you knew that G was trying to figure out a way to find G-safe JS popups to penalize them? Considering the obsession w/ G in this forum, the answer is self-evident.
Regardless, I think the issue here is money. The biggest users of popups, besides porn sites, are the megacorporations. G probably does not want to risk political damage by rocking the boat on this one. I think they would easily survive it and gain enormous credit from the entire internet community for ridding us of this scourge, but for political reasons, I doubt it will ever happen. I didn't really think of that when I brought up the suggestion.. but it's a nice fantasy anyway :)
Would you risk losing pagerank by putting a "G-safe" JS popup if you knew that G was trying to figure out a way to find G-safe JS popups to penalize them?
heretic:
With the spam issue, Google penalizes sites doing, from its point of view of course, spam.
With the JS issue, should Google start penalizing sites with fragmented/encrypted JS, only because it doesn't understand them?
Again, I think that JS 'parsing' capability is beyond the resources of Google, and maybe of any other big search engine.
cminblues
Now, both are advertising media. What does this have to do with Google's ranking algorithm?
Google, penalize sites that use popups.
Google, penalize sites that use banners.
Google, penalize sites that use ***.
[for *** substitute any pet-peeve you have related to surfing.]
Come on. Popups are not an SE spam technique and therefore are not, and should not be, part of Google's algo. Google's success is based on its focus on relevancy and user experience (fast loading, clear pages, no pop-ups, no pop-unders).
If you don't like popups or popunders, simply don't pop-up/under them.
So say I'm searching for forums about search engines. One has popups, one doesn't. The default search would show both, but if I typed "nopops:search engine forum", the one without would be the one I'm searching for. I think that is a fair use of such a feature, as it would just be another search parameter I could employ. :)
Let's assume I make a search for 'apples'.
A page about 'apples' with a pop-up about 'cars' is, IMO, differently relevant than a page without that pop-up, or with a simple pop-up about 'apples'.
So I think this is, in some way, a relevance issue.
I would be very happy if Google were able to find and understand pop-ups, and therefore able to include this parameter in the algo.
But this is not realistic, I fear.
cminblues