Forum Moderators: Robert Charlton & goodroi


Google, and Engagement and User Experience in different niches?


raseone

8:18 pm on Dec 14, 2015 (gmt 0)



Mods note: split off from December Google Updates thread


The "engagement" of a user or the nature of a "conversion" is very, very different from one site to the next especially when considering sites across the vast variety of the web. Satisfaction or a good "user experience" is a wildly subjective topic.

For example:

A site that offers downloads of fonts or mp3s may accomplish its mission in a few seconds with a single page view, while a site that offers long video content has most likely failed to satisfy the viewer if they have only stayed for a few seconds.

A site that is selling something can also achieve a sale within a single page view or within just a few seconds in some cases. That site may also have achieved its goal simply by getting an email from a visitor. Google does not necessarily have any way to know if any of this has happened.

A company site that lists distributors where their products are available may achieve success by having the visitor leave immediately to visit said distributor, or leave to visit the company's social network presence.

A business that sells a service may achieve success simply by having a visitor to their site call them on the phone which may only take a second or two. Such a business may be using a third party or secondary URL for sales, scheduling, regional service, etc... so leaving the site almost immediately would be a success.

Some sites may be "sticky" & encourage repeated visits over long periods of time. Some sites are just not meant to be sticky at all. They can provide everything a user would ever want from them in a single short visit. This is a good thing, not a bad quality signal.

Google cannot tell if your users are satisfied or not. They can only guess.

You must all realize that every time you modify your site & think that Google has rewarded you for your efforts, someone else has made the same improvements & been penalized for theirs. If you did nothing at all to your sites, all the same things would happen to your traffic and conversions. Your rollercoaster ride on Google does not come with a steering wheel or a brake pedal.

You are trying to discover the pattern and behavior of something that is random and senseless. There is no answer for you to find. You can & should make every effort to optimize your sites for the good of your users & your business, but to continue thinking that you can effect any real change in Google's treatment of your site is self-torture. You are all scratching for crumbs and bouncing around within marginal amounts of change.

You know your business & your users & your site. You have common sense. You know what you are doing. Your users are not stupid. Your site works fine.

Next time you get the urge to see how you're doing on Google go take a nap instead. It will do your site more good.


[edited by: Robert_Charlton at 10:27 pm (utc) on Dec 14, 2015]
[edit reason] moved from another location [/edit]

raseone

8:52 pm on Dec 14, 2015 (gmt 0)



I bore myself with my rants... Here is something practical.

1.) Sites that predate the Panda/Penguin madness seem to have a lead weight attached. They cannot recover in the vast majority of cases.

2.) Post Panda/Penguin sites seem to be in a different situation. Here is what I have done to achieve success with post Panda/Penguin sites.

2a.) Build a good site with valuable content.

2b.) Ignore Google guidelines & use common sense to build fast, usable sites.

2c.) Do not add nofollow tags or 301 redirects or any esoteric voodoo in your htaccess files.

2d.) Do not use the disavow tool or pay any other attention at all to who is linking to you. You can't control that & shouldn't try.

2e.) Do not use any of Google's partners like "Reach Local" or others who create fake blogs and articles to generate fake authority & buzz.

2f.) Use Bing and other search engines to help gauge actual search optimization & response.

2g.) Do not use Google analytics. Use an alternative with literal, sensible stats rather than the abstract, subjective stats in GA.

2h.) Do not engage in any gratuitous link building no matter how "white-hat" you think it is and no matter how many times Google has urged you to invest time in trying to offset the boogie-man of "bad links".

2i.) Stay out of the Google Webmaster Forum.

2j.) Don't take my advice or that of anyone else in this forum. When you need help ask specific questions with specific answers. Use your own judgement.

Just publish & don't be a jerk. Treat your users well, build a good, efficient site. Do something good for the world & ignore Google. Nothing else makes any sense. Don't let Google get away with all these convenient excuses to hide information, opportunity & connections from people.

martinibuster

12:55 am on Dec 15, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



A site that offers downloads of fonts or mp3s may accomplish its mission in a few seconds with a single page view, while a site that offers long video content has most likely failed to satisfy the viewer if they have only stayed for a few seconds.


That point and your other points are reasonable. But you are missing a part of the puzzle, and it's causing a misunderstanding about how algorithms work. It's not your fault; there is a lot of misinformation out there, so I can understand how you reached your reasonable and logical conclusions based on the incomplete information you have received.

When we discuss things like CTR and dwell time, what's being referenced is a variety of algorithms that do many different things. I'll discuss that at another point. There is an even more important point you have not been told about. Here is the part that is missing:

It's not how long a user stays on a site that matters. All of your points are about how long people stay on a site and the reasons why they stay or leave, whether quickly or after a long time. None of that matters.

What matters is what users do when they return back to the search engine.

Now here's another point you missed. For most of the CTR type algorithms what they are measuring is not how bad a site is. What Google is measuring is how bad the algorithm was.

There is more to be said but the above addresses and puts to bed all of the concerns you addressed in the first post.

Good luck!

mb

Robert Charlton

1:37 am on Dec 15, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



It's not how long a user stays on a site that matters. What matters is what users do if they return back to the search engine.
martinibuster, that's very true as far as you've taken it, though I suspect your succinct answer is an oversimplification.

I'm making an educated guess that Google's algorithms are niche specific, which you probably do as well... and what users do if they return back to the search engine probably varies quite a bit in certain niches.

A bounce back to the serps has long been considered by some to be a noisy signal. A bounce back to the serps and then looking at other results for the same search might be interpreted as a more concrete sign of dissatisfaction... but in areas involving, say, research on quick reference sites, searchers might routinely compare results on several sites. (This too is simply a succinct answer for the moment.)

raseone... you've done a good job at identifying many of the ambiguities and problems an engine might have in trusting many apparently obvious signals. It's worth noting, though, that the behaviors many SEOs assume to be the ranking signals only serve as clues that help Google identify certain kinds of page characteristics. Google then uses those to formulate heuristics, which are checked and rechecked, built into test algorithms, and put through many levels of trial-and-error calibration and statistical analysis before they're folded into either Panda or the core algorithm.

In general, I'd say that Google is much more aware than we are of which signals are noisy and which aren't... and they do not always get it right... but I really wouldn't assume that these methods are random or stupid.

raseone

1:44 am on Dec 15, 2015 (gmt 0)



@MartiniBuster

A valid reply. Thank you. I like a guy who likes a drink... but it's quite presumptuous to assume that you know what I have been told or what I have learned. My information is not from third-party articles full of speculation about what Google's algorithms MIGHT be doing. It is based on a combination of Google's own claims & my own 2 decades in the business.

The few examples I shared are obviously just a tiny fraction of the huge number of subjective issues that Google claims to be able to measure and thus include in its ranking factors. It hardly matters really, since Google's claim that all this madness on their search pages is quality-related is a total lie anyway.

Nice of you to tell yourself that all of my concerns have now been put to bed. You're wrong but I'm glad you feel satisfied.

To address your opinion...

If a searcher gets my site as a #1 result, looks at it for a while, then goes right back to Google and searches the same thing... this is still not a clear signal of bad quality. When offering a digital product, someone may visit the legitimate source then immediately return to Google to search for a free pirate copy. The searcher may simply be performing due diligence before finalizing a purchase. The searcher may have gotten what they wanted, then returned to Google to find more or different additional offerings, having been perfectly satisfied with their experience. Not only is it very easy to poke holes in the theory that this is a bad quality signal, but it is flat out WRONG of Google to equate piracy with a positive user experience.

It takes a lot of mental gymnastics to justify modern Google results with issues of quality & one of the requirements is that you accept that piracy = better user experience. Another thing you would have to accept is that big retail is better than little retail. Another thing you would have to accept is that 90% of every site ever made was suddenly deemed "low quality". Another thing you would have to accept is that YEARS of effort on the part of all the affected webmasters has continued to fail at producing the necessary "quality" to get back in Google's good graces.

Sorry dude. I'm sure you're very smart & a nice guy but if you can't accept that the Google "search engine" is totally rigged & unfair then you will never be able to move on to the next step.

Google themselves told the U.S. senate that they are "not a search engine".

Google themselves admitted that they believe people only see credibility & authority in "big brand names".

Google themselves have stated that they believe that searchers do not want answers to their queries, they want to be told what to do.

Google first tried to pass off Panda/Penguin as "anti-spam" measures, then switched to a story that it was about "quality & user experience".

Google pretended that they had not laid waste to zillions of quality sites and allowed the TCs in their webmaster forum to guide unknown thousands of hard working people on a very expensive wild-goose-chase for YEARS.

Facebook restricted the organic reach of their pages very similarly to Google & started charging for that reach. Two critical differences: they were honest about it, & the price of buying the reach back is reasonable.

One time I posted in the Google webmaster forum complaining about my site's rank, and the post actually outranked my site. GTFOH.

Robert Charlton

1:57 am on Dec 15, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



martinibuster... we apparently overlapped in our posting. One comment of yours I liked particularly....
Now here's another point you missed. For most of the CTR type algorithms what they are measuring is not how bad a site is. What Google is measuring is how bad the algorithm was.

Yes, the algorithm gets increasingly refined over time, kind of a winnowing process. As I'd mentioned in the Penguin 3 thread (and it doesn't matter, imo, whether Penguin or Panda here... the mathematical approach is similar)....

According to Google: Penguin 3.0 is continuing
Dec 1, 2014
https://www.webmasterworld.com/google/4719313.htm [webmasterworld.com]

...My own speculations here: I'm thinking that the algorithm may be highly "recursive"... with the same or related processes repeated on the results of the previous operations, giving us results that are increasingly refined. There's likely a pause to check results at every step, so Google can gauge whether the algorithm is working as anticipated and decide what to do next.
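A toy sketch of that speculation, for illustration only: the same re-scoring operation is applied to the output of the previous pass, with a pause after each pass to gauge the intermediate results. The damping rule, the hostnames, and the scores are all invented; nothing here describes Google's actual math.

```python
# Toy sketch of "recursive refinement": each pass re-scores the output
# of the previous pass, and there is a checkpoint between passes so the
# operator can gauge whether things are working before continuing.

def refine(scores, passes=3, check=None):
    for step in range(passes):
        mean = sum(scores.values()) / len(scores)
        # each pass pulls every score halfway toward the current mean,
        # a stand-in for whatever re-scoring a real system might apply
        scores = {url: (s + mean) / 2 for url, s in scores.items()}
        if check:
            check(step, scores)  # pause to verify before the next pass
    return scores

refined = refine({"a.example": 9.0, "b.example": 1.0})
print(refined)  # scores converge toward each other pass by pass
```

Each pass narrows the spread while preserving the mean, which is roughly the "increasingly refined" behavior described above.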

[edited by: Robert_Charlton at 1:58 am (utc) on Dec 15, 2015]

martinibuster

1:57 am on Dec 15, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



If a searcher gets my site as a #1 result, looks at it for a while, then goes right back to Google and searches the same thing... this is still not a clear signal of bad quality.


You're right. And you're also still short of information. First, you are still insisting that the algorithms are trying to understand what's wrong with your site ("a clear signal of bad quality"). I will repeat what I wrote above: They are not looking for signals of bad quality on your site. They are looking for poor performance of the algorithm and using the click logs to train their algorithms to do it better. So please stop insisting that user engagement metrics are being used as a quality signal because they're not.

The other piece of information you are short of is that one visitor to one site is a single data point. The algorithms are looking at thousands and even millions of search transactions in order to perceive that something is wrong with the algorithm (not with your site). So even if they do take action it's a scaled action that is at the algorithm level, not at the individual site level.
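That aggregation idea can be sketched in a few lines. Everything below (the log fields, the thresholds, the queries) is an invented illustration of the principle being described, judging the algorithm's performance on a query from thousands of transactions rather than judging any one site from one visit; it is not a description of Google's actual systems.

```python
from collections import defaultdict

# Hypothetical click-log records: (query, clicked_rank, returned_to_serp).
# Field names, queries & thresholds are invented for illustration.
click_log = [
    ("widget map", 1, True),
    ("widget map", 1, True),
    ("widget map", 1, True),
    ("blue widgets", 1, False),
    ("blue widgets", 1, False),
    ("blue widgets", 1, False),
]

def flag_underperforming_queries(log, min_samples=3, return_threshold=0.5):
    """One visit is a single data point; only queries with enough
    samples are judged, and what gets flagged is the query (i.e. the
    algorithm's performance there), never an individual site."""
    stats = defaultdict(lambda: [0, 0])  # query -> [returns, total]
    for query, rank, returned in log:
        if rank == 1:  # how well did the top-ranked result perform?
            stats[query][1] += 1
            stats[query][0] += returned
    return [q for q, (ret, tot) in stats.items()
            if tot >= min_samples and ret / tot > return_threshold]

print(flag_underperforming_queries(click_log))  # ['widget map']
```

Note that the output names a query, not a site: the "action" such a process suggests is retraining or adjusting the ranking for that query, which is the algorithm-level scale described above.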

Yes, there are algorithms that look at CTR to identify lower-ranked sites that satisfy users. Here's the part that doesn't get discussed: these algorithms can also emulate a site visitor and predict which lower-ranked sites will satisfy a user. So there doesn't even have to be a click log, because after a certain amount of data is collected (like in the thousands and millions of search transactions) they can predict it. Microsoft has this and so does Google. But those aren't real-time algorithms and they aren't used as ranking factors.

The algorithm process you presented does not happen in real life.

[edited by: martinibuster at 2:02 am (utc) on Dec 15, 2015]

raseone

1:58 am on Dec 15, 2015 (gmt 0)



@Robert Charlton

I don't really assume that it's totally random or stupid. I assume that it is so rigged & self-serving that the result is random & stupid. I'm sure Google is still motivated to have the most powerful search engine. It is however obvious to me that the results are so clouded by a combination of greed & nonsense that they are beyond unfair. It smells an awful lot like Google simply wants to charge money for what used to be free, organic reach.

If it walks like a duck & quacks like a duck, I'm not waiting for the DNA test. I don't actually care if it's a scam or a mistake. When combined with Google's near monopoly on search, it is a serious problem for the searchers as well as the publishers.

If I search for Zelda, I don't need Google sending me the message that the best thing to do is pirate the Zelda game ROM & the 2nd through 5th best things to do are also to pirate the Zelda game ROM... All I wanted was a map to the last level.

martinibuster

2:05 am on Dec 15, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



and what users do if they return back to the search engine probably varies quite a bit in certain niches.


It goes back to user intent. There are various kinds of queries which, most of the time, identify the user intent. So while the ranking algorithm can divide things up by niche, this isn't a ranking factor we're discussing. This is something that either:

1. Comes after the ranking algorithm.
The first one is part of a suite of algorithms that focus on re-ranking the SERPs in order to meet user intent.

2. Isn't a part of the ranking process at all.
The second one is a function of machine-training the algorithm to better understand what users tend to like, which includes user intent but is also more like taking note of what large numbers of users (thousands and millions) tend to like and then adjusting the SERPs for that. There are no statistics for how often this happens, so it could be rare, or it could only be applied in geographic searches or for certain kinds of searches. That second one is a big unknown.
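The first of those, a post-ranking pass, could be sketched like this. The keyword-to-intent mapping, the page labels, and the hostnames are all invented stand-ins; real intent classification is far more involved than matching a word in the query.

```python
# Illustrative sketch (not Google's actual pipeline) of a pass that
# re-orders an already-ranked result list to match a detected intent.

INTENT_BY_QUERY_WORD = {  # invented heuristic mapping
    "buy": "transactional",
    "map": "navigational",
    "how": "informational",
}

def detect_intent(query):
    for word, intent in INTENT_BY_QUERY_WORD.items():
        if word in query.split():
            return intent
    return "informational"  # assumed default

def rerank(query, ranked_results):
    """ranked_results: list of (url, page_intent) from the core ranker.
    Stable sort: intent-matching pages float up; the core ranker's
    order is otherwise preserved, so this comes *after* ranking."""
    intent = detect_intent(query)
    return sorted(ranked_results, key=lambda r: r[1] != intent)

serp = [("a.example", "informational"), ("b.example", "transactional")]
print(rerank("buy widgets", serp))  # b.example moves to the top
```

The design point the sketch makes is the one in the post: this step consumes a finished ranking rather than contributing a ranking factor.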

[edited by: martinibuster at 2:09 am (utc) on Dec 15, 2015]

netmeg

2:05 am on Dec 15, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Sorry dude. I'm sure you're very smart & a nice guy but if you can't accept that the Google "search engine" is totally rigged & unfair then you will never be able to move on to the next step.


There is no "fair" in search, if for no other reason than it's not possible to be fair to every single website when there's only ten links (if that) on the first page.

At any rate, we have a forum for discussing Google's business practices:

[webmasterworld.com...]

raseone

2:07 am on Dec 15, 2015 (gmt 0)



@MartiniBuster

The convolution and infinite looping of this thought process should be enough to prove that it's nonsense!

So now Google is not analyzing the sites at all? It's only analyzing itself!? And none of this affects rank or results!?

It takes a special type of person to continue insisting on what another person knows or does not know. You apparently have Stockholm syndrome.

raseone

2:14 am on Dec 15, 2015 (gmt 0)



NO FAIR IN SEARCH!? That's insane. Not acceptable, not true.

Fairness does not mean that everybody wins. It means that everyone gets a chance.

Content is king. Authorship matters. Ownership matters. Quality matters. Piracy is illegal.

raseone

7:32 am on Dec 15, 2015 (gmt 0)



Holy jebuss. I didn't see these replies... Jeeeez... the presumption... now you're kinda just pissing me off. As if some of us do not have thousands or millions or billions of our own data points from which to measure... The distraction... as if Google themselves haven't cited these things as reasons for lost ranking.

Here's some presumption for you: you've got your heads so far up Google's ass that you don't realize that people have been balls-deep in supposition about how this could possibly be righteous or logical for almost 5 YEARS NOW.

Do you not remember what search was like before? Have you never tried a search engine other than Google? For #*$!s sake do a side by side comparison right now! Are you nuts?

Google SERPs are clearly optimized for Google's benefit, not the consumer's.

I see all the classic signs of a weak argument even in the limited response here:
1.) go after the person, not the issue
2.) attack the weakest point, avoid the strongest
3.) distract & deflect
4.) overcomplicate
5.) change the subject

If Google admitted openly that its "search engine" had been reduced to an advertising scheme, then it would lose dominance. Google maintains that all this incomprehensible madness in its search pages is the result of some quest for better quality. How, after years of watching the results, can any of you possibly believe that?

Google thinks it can outperform the publishers who feed it by offering theft as an alternative to purchase & pretending it's a victimless crime... making Google seem cool & content creators seem greedy. Damn those dirty authors and artists & scholars, those programmers & scientists, those filthy tradesmen! They're so damn greedy, thinking they could make a living by devoting their every waking hour. We'll show them.

In the pre-Panda/Penguin era, if you spent a lot of time thinking about Google algorithms or SEO rather than the creation of valuable content, this was a clear sign that you were not adding anything of value, that there was already enough of what you were offering, or even that you might be a scam artist.

Efficiency in design & communication is obvious to any good designer. SEO beyond efficiency is an attempt at manipulation... Sorta like marketing.

People don't "search", they "Google".

In the current environment even some of the best of you have been forced into the deceptive practice of trying to work the algorithms rather than focusing on what you're making. It's not because you're gullible, it's because you're smart & figured you were smart enough to play the game.

Unfortunately this does not even really qualify as a game. Games are not rigged. In games the rules are clear. This is not a game. This is business. As a site you are a commodity. As a searcher you are a commodity.

To Google you are food.

If you are not good for a dollar, you might be good for a click. This is the equation that Google uses to value content, site & consumer alike. Perhaps Google is correct that everything they can reach is theirs to exploit. That is the law of nature. I'm not saying they are incorrect. I'm saying they're wrong.

Quality to the consumer & quality to Google are two very different things. Google is seen as an authority on authority, a judge of judgement, a creditor of credibility. The average searcher has no idea what direction they have been steered into or away from. That frog just keeps sitting in the pan as the heat slowly rises.

Maybe data just can't be owned. Maybe only the access can really be monetized. Maybe Google realized this a long time ago. Maybe none of you that argue with me really make anything. Maybe you just work to manipulate the access to other people's work. If that were the case then suddenly I would concede that you are right & I am wrong.

I can only wonder...
What will all of those who work to manipulate access to data do... when those of us who would create data are all busy cleaning your toilets? How many generations of these parasites can survive on our carcass?

On that note... I claim total victory. I win. I'm a smarty pants. I have it all figured out. That's why I have time to sit in a forum picking through Google's poop looking for peanuts.

hasek747

7:51 am on Dec 15, 2015 (gmt 0)

10+ Year Member



@raseone

You're being very rude (not sure if accidentally or intentionally); I doubt anyone will continue the discussion with you further. Go somewhere else, maybe.

netmeg

1:29 pm on Dec 15, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



(I love it when new people come through and think they're gonna school the rest of us on Google and SEO)

Thinking maybe this "discussion" has already outlived its usefulness in this forum.

raseone

5:09 pm on Dec 15, 2015 (gmt 0)



I'm not new. My sites are older than Google.

I'm not trying to "school" anyone. I don't care if anyone replies. I've said my piece.

Politeness is not required but I'll give it when I get it. Rudeness comes in many forms. Mine is just simpler & not hidden in a passive-aggressive superiority complex.

SEO is 90% garbage.

If you refuse to admit what you're up against you can't possibly hope to deal with it.

I stand by everything I've said. Study the algorithms all you want if you feel like that's a better thing to do than producing quality content. Perform whatever mental gymnastics you need to justify Google's behavior if that makes you feel better. Stay in denial. Pretend it's about quality & that it's not just the monetizing of "organic" reach. I'm sure that strategy will eventually work again if you just wait for one more algorithm update.

Robert Charlton

9:07 pm on Dec 15, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I think we'll lock it here.