| 1:39 am on Jul 13, 2012 (gmt 0)|
|Maybe comparing them to other well-executed designs that they know of from human tests. |
The day that art can be judged by a coder's idea of what art is will signify the end of all art.
| 2:13 am on Jul 13, 2012 (gmt 0)|
Well, I'm not really suggesting they would care about the "art" of it, at least not directly. But it would be possible to detect through artificial intelligence whether a user interface corresponds to other known useful interfaces. And I don't think that really signifies the end of good design at all.
| 3:01 am on Jul 13, 2012 (gmt 0)|
So say you have an image that is part of your theme for a green skin, and then you use Photoshop to shift its colour spectrum into red tones, and again into blue tones. Or say you crop it and only use 90% of the original image. How is a bunch of formulas going to realise that they are all the same pic, let alone compare it to any other image and discern which is the better design?
Only a minutely small percentage of people have the ability to be creative and that percentage varies between cultures. That being said, a machine hasn't a hope in hell of judging good design. Ok, so a clever bunny might think that such a thing, however complex, can always be programmed. If that was the case then after more than 2 decades of such brilliance one would expect to see software able to correctly space the text in headlines. But that's not the case because if machine spacing was as good as what a typographer can produce, then you wouldn't see errors in almost every company logo that has been produced since commercial artists moved to computers.
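(As an aside: recolour-tolerant "same picture" matching is actually a fairly well-studied problem. A difference hash compares each pixel only to its right-hand neighbour, so any tint shift that preserves relative brightness leaves the hash unchanged; crops are harder and need sliding-window or keypoint methods beyond this sketch. A stdlib-only toy, using a plain grayscale matrix where a real system would first grayscale and resize the decoded image, e.g. with Pillow - assumed, not shown:)

```python
def dhash(gray):
    """Difference hash: one bit per pixel pair, set when a pixel is
    brighter than its right-hand neighbour. Tint shifts that preserve
    relative brightness leave every bit - and so the hash - unchanged."""
    bits = 0
    for row in gray:
        for a, b in zip(row, row[1:]):
            bits = (bits << 1) | (1 if a > b else 0)
    return bits

# A tiny 8x9 "image" standing in for a resized grayscale photo.
row = [5, 2, 9, 1, 7, 3, 8, 0, 6]
original = [row[:] for _ in range(8)]
# The same image pushed into "red tones": a monotonic remap of every
# value, which keeps each brighter-than-neighbour comparison intact.
retinted = [[2 * v + 30 for v in r] for r in original]

assert dhash(original) == dhash(retinted)
```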
| 3:02 am on Jul 13, 2012 (gmt 0)|
|analysing the screenshot to see how much of the page is taken up with templates / how much is taken up with ads |
..and how much real content is then left..
Already discussed in March 2011..when I brought it up.. :)..the ads to "space" content ratio..very few wanted to hear..
Most people then spent months saying it couldn't be about the ads to content ratio..
Matt just confirmed in the interview with Eric ..that it was indeed about the ads to content ratio..
As I also said at the time ..if it isn't about the ads to content ratio..what the hell do people think preview was for..and why it is the size in pixels of a site seen via a mobile device ratio..
|Only a minutely small percentage of people have the ability to be creative |
I'm one of that minutely small percentage ..a professional artist for around 40 years now..
Nothing to do with "cultures"..some cultures just allow more expression than others..and put a greater store by visual design factors and elements..webmasters think it is all about the words..it isn't..
[edited by: Leosghost at 3:08 am (utc) on Jul 13, 2012]
| 3:07 am on Jul 13, 2012 (gmt 0)|
I'd say it's a very interesting idea. Since we're theorizing or "blue-skying it" here, a further extension of the idea occurred to me.
Google could begin with browser data they know they can depend on to a high level of significance. Then they could extend that solid picture to analyze sites from screenshots where there have not been enough Chrome users to depend on the browser data alone.
One area that still doesn't fit with this theory for me is the fact that Panda is said to be directly about content quality - even to the degree that we were told what kind of questions were asked about websites at the very beginning of the process. So if this theory holds water, it still needs to be coupled with another kind of content analysis. Going with a right brain measure for what is essentially a left brain goal doesn't immediately make a lot of sense to me.
Another ingredient in the mix is that they don't need to Panda-analyze "every website" and we know, in fact, that they didn't. Amit told us that Panda 2.0 went further into the long tail than Panda 1.0 did. So apparently they were first looking at the sites that get the most impressions in the search results, and then later they were going deeper.
| 3:14 am on Jul 13, 2012 (gmt 0)|
Another factor just occurred to me - the "Above the Fold" algorithm [webmasterworld.com] was rolled out almost a year after Panda 1.0 - and yet it seems that is a relatively simple screenshot analysis compared to this current theory.
| 3:34 am on Jul 13, 2012 (gmt 0)|
@tedster That's a good point. The above-the-fold algorithm seems rather basic in comparison, which does somewhat fly in the face of this theory. Unless the earlier implementation didn't look at above-the-fold specifically, and they later decided to give that part extra weight. And yeah, I'm not sure this explains the idea that content is valued more. Unless it's a basic measure that x% of the page is taken up with actual content rather than ads plus site template. Which isn't what I think of when I think of quality content, but it could be defined that way.
@kendo there are a lot of parts to design, some more subjective than others. As an extreme example, Google can fairly objectively state that if you stick black text on a black background, that will not be a good design outcome for their audience. Of course, you may be an artist and this could be some sort of intentional artistic vision. Remember, this is all about Google satisfying their searchers, not about directing people to the most artistically revolutionary site. The kinds of design choices that I think could be used in an algorithm are more related to the basic layout, font-sizes, and distribution of ads etc.
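(The black-on-black case really is mechanically checkable: WCAG's relative-luminance formula yields a contrast ratio, where 1.0 means the text is invisible and 21 is black on white. A sketch - the formula and the 4.5:1 floor are WCAG 2.0's, and nothing confirms Google uses them:)

```python
def relative_luminance(rgb):
    """WCAG 2.0 relative luminance for an sRGB colour (0-255 channels)."""
    def channel(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Ratio of the lighter luminance to the darker, offset per WCAG."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a black background: ratio 1.0, i.e. invisible.
assert contrast_ratio((0, 0, 0), (0, 0, 0)) == 1.0
# Black on white: the maximum possible, 21:1.
assert round(contrast_ratio((0, 0, 0), (255, 255, 255))) == 21
```

A crawler that has both the rendered pixels and the stylesheet could flag any text whose ratio falls below WCAG's 4.5:1 floor without making any artistic judgement at all.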
@Leosghost sorry I missed your post. I'm sure my thinking isn't original - there's so many theories it's impossible to keep track :) The site preview is not a bad way of getting some extra information about what kinds of layouts people like to see. At least on a basic level that gives some extra data points. And for competitive phrases it probably gives them quite a lot of useful extra information.
| 4:00 am on Jul 13, 2012 (gmt 0)|
|..the ads to "space" content ratio.. |
Is there a difference between the ads to "space" content ratio and the ads to content ratio?
| 4:11 am on Jul 13, 2012 (gmt 0)|
In Scandinavia (I forget which country) there is an early Fauve painting of black on black, and it's quite significant. What is not significant is coders deciding that good design can be defined by a bunch of scripts. If they believe that then they are fooling themselves and will try to fool us, as if they aren't already.
@Leosghost being an artist doesn't make one creative. If you are blessed with creativity then you may be lucky or you may be cursed, because real creativity is not something that surfaces only when summoned. Regarding my comment about its relationship to culture, school teachers catering for multinationals can tell you which ones are least likely to be imaginative. Otherwise most artists, like any other artisans, rely on their training, past experience and the work of others. One fellow that I had the misfortune of associating with once ripped a page from an interior decorating magazine to send to his client with a bill for his design fee... $3,000. Mostly it's a monkey-see-monkey-do world, but yes, it takes a trained eye to spot the difference sometimes.
| 4:40 am on Jul 13, 2012 (gmt 0)|
|What is not significant is coders deciding that good design can be defined by a bunch of scripts. |
|@Leosghost being an artist doesn't make one creative. |
Only because the term artist, is bandied about so lightly, it has become debased, everyone claims to be one..
|If you are blessed with creativity then you may be lucky or you may be cursed because real creativity is not something that surfaces only when summoned. |
Agreed :) but as it is always there, "summoning" is not needed..
|Regarding my comment about its relationship to culture, school teachers catering for multinationals can tell you which ones are least likely to be imaginative. |
Agreed :) however there are many teachers who cannot do, therefore they teach, it is a way to hide from the lack of talent in oneself, it is however respectable if done to pay the bills and buy the materials..but should always be considered a temporary sentence..
|Otherwise most artists, like any other artisans, |
You confuse artisans and artists..an artist creates (using head, hands, and heart), an artisan makes..using head and hands only..sometimes only hands..
|rely on their training, past experience and the work of others. |
You can't train talent into someone, only show techniques that may, or may not, be useful..we are all influenced by our past experiences, life and art could not be otherwise..the works of others merely show what others have done, not what you should do..
|it takes a trained eye to spot the difference sometimes. |
So those who would have you pay them to train you would have you believe..training is not required..and is even to be avoided, amongst other reasons, so as to avoid mimicry..
The things that influence the reactions of people to colour, light, space, form, design (even of webpages ;-) would surprise, shock and in some cases maybe horrify people if they knew why they react the ways in which they do..;-))
edited for speeling
[edited by: Leosghost at 4:49 am (utc) on Jul 13, 2012]
| 4:43 am on Jul 13, 2012 (gmt 0)|
@kendo That's kind of my point - in art, "black on black" is maybe considered acceptable. Online, if we're talking user experience, that's just not the case. This would, with a great degree of objectivity, be considered bad user experience whether it's significant artistically or not. And algorithmically, those are certainly the kinds of decisions that I think it would be valid for Google to make a judgement on.
| 7:31 am on Jul 13, 2012 (gmt 0)|
Here's my take on it....
1) Google gathers user metrics via the browser (possibly in conjunction with data from ISPs etc.) as people visit and use your site (the sort of metrics you see in Google Analytics - I'm not saying they use Google Analytics, I'm just mentioning that to demonstrate the sort of user metrics I'm referring to). They do this for a period of weeks (the period between Panda refreshes). This tells them where the 'potentially' low quality pages are.
2) As googlebot crawls your site it gathers information that helps it make sense of the user metrics. For example, some pages will have low time on page and high bounce rates, but if the page provides the perfect answer very quickly, that is a good page. So it needs to see if the content (text, image, video, etc.) explains the user metrics, and all this combined together tells Google if a page is high or low quality.
3) Having identified the low quality pages of your site, they then do a massive crawl just before a Panda refresh to see if the low quality pages still exist on your site, and if there is a high proportion of them still in existence at that point, a demotion factor is calculated for the low quality areas of your site and, to a lesser extent, areas of your site that link to the low quality pages.
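(Step 2's caveat - that a fast exit can be a good sign - is the interesting bit. As a toy illustration only, with every signal name and threshold invented for this sketch, a per-page quality guess might combine metrics like this:)

```python
def page_quality_guess(time_on_page_s, bounce_rate, answered_query):
    """Toy heuristic: a short visit with a high bounce rate is only a
    bad sign if the page doesn't look like a quick, complete answer.
    All names and thresholds here are invented for illustration."""
    if time_on_page_s < 15 and bounce_rate > 0.7:
        return "high" if answered_query else "low"
    return "high" if bounce_rate < 0.5 else "medium"

# A dictionary-style page: visitors grab the definition and leave fast.
assert page_quality_guess(8, 0.85, answered_query=True) == "high"
# A thin page people abandon just as fast.
assert page_quality_guess(8, 0.85, answered_query=False) == "low"
```

The crawl-side content analysis in step 2 would be what supplies something like the `answered_query` flag, so the same raw metrics can be read two different ways.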
I have a theory that you could get a page to the top of Google with just a few words on it and terrible design, IF those few words were so astounding and intriguing that it created superb engagement.
For example, if I created a website about the solution to Panda and the home page had nothing except one sentence in the middle that said "CLICK HERE FOR THE SOLUTION TO PANDA" that would be a pretty engaging home page for people looking for a Panda solution. If the website really did contain the solution, it would then get links, likes and shares like crazy. The whole website could be ugly, but if it contained the solution to Panda then human reaction to it (user metrics) would tell Google all it needs to know.
I'm with Kendo on this one. You can't get a machine to think and make all the decisions and judgements a human does in a split second based on our entire life's experience. How can a machine possibly learn at the rate humans do? A machine can only experience the web - humans experience everything in the world, and it all affects our thinking. That thinking changes every day as well. How can a machine hope to keep up with that?
| 8:20 am on Jul 13, 2012 (gmt 0)|
|I have a theory that Panda is actually evaluating screenshots of our pages and comparing them on a broad scale. |
Then how come a scraper - and I mean a 100% scraper: code, design, images, text, the lot - can rank higher than the original, and in many instances push the original out of both the regular and image SERPs?
DMCA? Waste of time, a .eu on an Italian server!
| 8:38 am on Jul 13, 2012 (gmt 0)|
I think all of our web pages are essentially just a series of screenshots and any area that is repeated across all pages is ignored. Can't prove it, but it feels right.
| 8:41 am on Jul 13, 2012 (gmt 0)|
|the sort of metrics you see in Google Analytics |
Would it be realistic, then, to think they are tracking mouse movements and clicks, creating heatmaps for every page? Or would that be too much data?
I recently revamped an old, rightfully Pandalized site. In Analytics, stats show pages/visit up by 45%, visit duration by nearly 300%, bounce rate -30%. Very anxious to see if this will have an effect in the next data refresh (since it's now a low-traffic site, I have my doubts).
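(Tracking every raw mouse coordinate would indeed be a firehose, but binned into a coarse grid it compresses to almost nothing. A minimal sketch of that aggregation - purely hypothetical, not a claim about what Google actually collects:)

```python
from collections import Counter

def heatmap(events, cell=50):
    """Bin (x, y) mouse positions into cell x cell pixel squares.
    Shipping counts per cell instead of raw coordinates keeps the
    payload tiny no matter how long the visit lasts."""
    grid = Counter()
    for x, y in events:
        grid[(x // cell, y // cell)] += 1
    return grid

# Four sampled positions: two near the top-left, two near one link.
moves = [(10, 12), (30, 40), (260, 300), (270, 310)]
hot = heatmap(moves)
assert hot[(0, 0)] == 2   # two samples in the top-left cell
assert hot[(5, 6)] == 2   # two samples clustered on the same cell
```

At one small dictionary per page view, the "too much data" objection mostly goes away; whether it is worth collecting is another question.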
| 8:52 am on Jul 13, 2012 (gmt 0)|
but what about window sizes?
if a Chrome user comes along and sees 20% of the screen taken up with ads, google might downgrade the site. but the very next user might come along with a bigger window and only see 5% taken up with ads, and not even notice them. so it has been downgraded for nothing.
and it's not just the size of the window either, it's all the browser furniture too. some people will have nothing, and others will have 3 rows of bookmarks.
the actual amount of screen real estate that is available to each user probably differs by a huge amount.
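(The viewport objection is easy to make concrete: the same page's first-screen ad share swings a lot with window size. A toy geometry sketch - the rectangles and sizes are made up, and real layouts reflow, which this ignores:)

```python
def visible_ad_fraction(ad_boxes, viewport_w, viewport_h):
    """Fraction of the first-screen viewport covered by ads.
    ad_boxes are (x, y, w, h) rectangles in page coordinates; only
    the part above the fold counts. No overlap handling."""
    visible = 0
    for x, y, w, h in ad_boxes:
        clipped_w = max(0, min(x + w, viewport_w) - max(x, 0))
        clipped_h = max(0, min(y + h, viewport_h) - max(y, 0))
        visible += clipped_w * clipped_h
    return visible / (viewport_w * viewport_h)

ads = [(0, 0, 728, 90)]  # one leaderboard banner at the top
# Same page, two window sizes: the "ad heaviness" signal moves a lot.
small = visible_ad_fraction(ads, 1024, 600)   # ~10.7% of the screen
large = visible_ad_fraction(ads, 1920, 1200)  # ~2.8% of the screen
assert small > 0.10 and large < 0.03
```

Which is exactly londrum's point: any screenshot-based ad ratio is only meaningful relative to some assumed reference viewport, not to what each visitor actually sees.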
| 9:22 am on Jul 13, 2012 (gmt 0)|
@londrum if they're smart enough, they could also decide to prioritise different results based on your screen size. I don't think there's any of that going on at all, but it would be possible.
| 9:38 am on Jul 13, 2012 (gmt 0)|
@rango - I see sites ranked differently depending on the device and browser I use, so I think it is going on already. My site ranks lower on my iPhone, and I don't (yet) have a mobile site, so my site is slow on iPhone, quicker on a laptop using IE, and quicker still using Chrome. It makes sense that Google would rank my site lower on a mobile device if my competitors have mobile sites that are faster and nicer to use than my full site (or their full versions are quicker and nicer to use).
| 9:53 am on Jul 13, 2012 (gmt 0)|
Interesting. I haven't really looked that closely at differences between results based on device. Well, last time I tried about a year ago it was disappointingly not very good at picking out mobile targeted sites.
| 6:58 pm on Jul 13, 2012 (gmt 0)|
Well let's hope they don't try to do it. Since posting in this thread and discussing it with my partner (another professional artist for 40 years) I do fear that someone will try it... because they know no better and will theorise that it is possible.
| 7:07 pm on Jul 13, 2012 (gmt 0)|
do you ever think that google are trying to analyse too much?
i mean, most people can tell within 5 seconds whether a site is good or rubbish, and they don't take into account a tenth of the stuff that google do.
in my mind, a good site is good, and who cares what percentage of words you can see above the fold. if it's good i'm not going to dump it just because it's 15% instead of 10.
sometimes i think of google as a crazy cook. every time they come across a new vegetable they have to chop it up and add it to the recipe. they can't help themselves. every bit of food gets a pinch or two in the bowl, hoping that it's going to make it taste a little better.
| 10:17 pm on Jul 13, 2012 (gmt 0)|
PageRank was replaced by some UX stats analysis: just as Google used to look at the number of links and their quality, now they check usability stats, because those can tell you everything about a website.
| 12:54 am on Jul 14, 2012 (gmt 0)|
|in my mind, a good site is good |
I have been reviewing sites for more than a decade, hundreds of new sites every year, and from my experience that task cannot be performed by a bunch of scripts, even when referred to as an "algorithm".
| 1:19 am on Jul 14, 2012 (gmt 0)|
Kendo, if you don't believe algorithms can decide what sites are good, then do you actually ever use any search engines?
What do you actually suggest as an alternative to algorithms?
| 1:50 am on Jul 14, 2012 (gmt 0)|
I would say that if anything, the ATF update is associated with analysing screenshots, rather than Panda.
| 5:12 pm on Jul 14, 2012 (gmt 0)|
Just to throw an observation in here. I watch a 4 year-old site that is largely Flash based. The screenshot shown in SERPs is horrible - a huge white space (where the Flash application would be), a very basic header graphic, an advertising banner and, lower down, some text in a very old-skool fluid-width format with not very attractive design.
But, because of the Flash content, the site holds the attention and it has gone from strength to strength through both Panda and Penguin - pretty much now one of the Top 3 dominant sites in its relatively competitive sector. Graphically and design-wise, it's got "not updated since 1994" written all over it, so if Google looks at the screenshot, I can't see how it would have anything more than a very small impact in the grand scheme of things.
| 2:49 pm on Jul 15, 2012 (gmt 0)|
|if you don't believe algorithms can decide what sites are good, then do you actually ever use any search engines? |
I don't say that algorithms don't work. In this thread I am saying that using an algorithm to judge good/bad design is a ridiculous concept, especially when they have so much trouble providing good results based on matches to a simple search string.
Let them fix that part first before pontificating about the definition of art.
|if a Chrome user comes along and sees 20% of the screen taken up with ads, Google might downgrade the site |
And just how would they decide what percentage is taken up by ads? Would it be the percentage of window area used by images in proportion to the average user's widescreen or the percentage of image area that is hyperlinked? Would you count internal links as advertising links? What about when they use a redirection script that may just resolve to another internal link?
What if I have a software site and load the pages with all sorts of ads that, instead of pointing to outside links, point to the top-selling software lines, which might yield 20-70% of their sale price?
Let's face it, the only way a search engine like Google is going to know if an image points to an ad is if the hyperlink points to Google or another known affiliate marketeer!
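(That last point is roughly how a crawler-side heuristic would work: the link target often gives the ad away when it resolves to a known ad network. A trivial stdlib sketch - the hostname list is illustrative only, and "tracker.example" is invented:)

```python
from urllib.parse import urlparse

# A few well-known ad/affiliate hostname fragments; illustrative,
# not exhaustive, and "tracker.example" is a made-up placeholder.
AD_HOST_HINTS = ("doubleclick.net", "googlesyndication.com",
                 "amazon-adsystem.com", "tracker.example")

def looks_like_ad_link(href):
    """True when the link's host matches a known ad-network domain."""
    host = urlparse(href).netloc.lower()
    return any(host.endswith(hint) for hint in AD_HOST_HINTS)

assert looks_like_ad_link("https://ad.doubleclick.net/ddm/clk/123")
assert not looks_like_ad_link("https://example.com/reviews/widget")
```

Which also concedes kendo's point: ads routed through internal links or unknown redirect scripts would sail straight past a host-matching heuristic like this.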
| 11:34 pm on Jul 15, 2012 (gmt 0)|
People searching in Google are 99.99% of the time not looking for "art". They're just looking for a user-friendly design. Calling that art is in my opinion just ludicrous.
Since user-friendly design is often largely about following standard practices, I feel this actually is something that Google has a good shot at detecting. Particularly if they use humans to first work out the design patterns that people consider to be good.
Just the same way a user makes up their mind within a second of viewing the page whether it's for them or not, an algorithm could come pretty close to making similar decisions.
Sure, it can't be the only factor when deciding to rank a site or not, but at certain thresholds I think it can probably be quite a useful one for them, particularly when the judgement is backed up by other user-activity data they will have gathered.
| 10:21 pm on Jul 16, 2012 (gmt 0)|
|DMCA? Waste of time, a .eu on an Italian server! |
Isn't removal from the US based search engines a big part of that exercise?