|What is your best question about Google SEO?|
Often I find myself picking up a profitable insight simply by listening to a good question, like: "After removing my user-generated content, I ran the site through a keyword density tool to identify themes for new articles to replace the removed UGC, but how do I handle quality control when outsourcing so many article ideas at once?"
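As an aside, the sort of keyword-density check mentioned above can be sketched in a few lines of Python (the function name and sample text are mine, purely for illustration; real tools also strip stopwords like "and" or "for"):

```python
import re
from collections import Counter

def keyword_density(text, top_n=5):
    """Return the top_n words with their counts and share of all words."""
    words = re.findall(r"[a-z']+", text.lower())  # crude tokeniser
    total = len(words)
    counts = Counter(words)
    return [(word, n, n / total) for word, n in counts.most_common(top_n)]

sample = "seo tips and seo tools for seo beginners"
for word, count, share in keyword_density(sample, top_n=2):
    print(f"{word}: {count} ({share:.0%})")
```

A stopword list and stemming would get the output closer to what commercial density tools report, but the core calculation is just this word-count ratio.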
Other times I can identify people to pay less attention to by listening to their less-than-impressive questions, like "Why don't I rank #1 after stuffing meta keywords?"
I am curious what is your best question about Google organic search?
Let's try to keep the noise down on this thread and skip the silly questions :)
First off, thanks for supporting this. Here are the questions, relieved from their cumbersome thread. Please edit or delete where appropriate!
Regarding Webspam, I might ask:
If our analysis is accurate (based, admittedly, on only a small number of sites), we see a sharp and sudden decline in UGC traffic. Given the 'free-speech' nature of UGC, it cannot be analysed under the same umbrella as 'editorial' content. Is it possible some sites are being mis-identified?
Given the apparent 'confusion' created by requiring webmasters to 'nofollow' a lot of links (to play it safe in light of so many penalisations): how useful is link data, and can it be trusted at all now that the organic reality has been so dramatically skewed?
I believe 'negative SEO/toxic links/disavowing' should have been something vehemently guarded against (the concept of any link or relationship being 'negative' rather than simply 'ignored'). By allowing SITE B to hurt SITE A, control was taken away from whitehat players. It has now clearly resulted in a new, more powerful way to game the playing field. Why did this happen?
The latest slew of updates has brought a great deal of webmasters out of the 'woodwork' in terms of SEO. Assuming the primary function of a search engine is to reduce the effect 'players' have, and reward those who are 'focused on their content, site, products, users' - how does this benefit the internet in general?
There are a great many (thousands of) whitehat webmasters being penalised into non-existence (or the effective equivalent): sites that have demonstrated good practice, satisfied their audiences, and neither betrayed nor misled users. This is unacceptable. What can be done about it?
Following on from a Webspam statement confirming that 'brands are not favoured': high levels of instability favour only those with deep pockets or short-term goals. Given the current, exceptionally unusual, and well-documented turmoil hitting some whitehat webmasters, isn't that indirectly favouring 'brands' and 'blackhat'?
While it's understood that age shouldn't be a ranking factor, older sites can *unwittingly* accrue an enormous number of random links, making them far more likely to be hit by penalties. Has this been properly factored in?
Specific to a small group only: on November 16-19, 2012, an unknown update (a Panda 21.5 ghost!) caused a lot of forums to experience a sharp, then sustained, drop in traffic. Some of these were sites that had otherwise benefitted from Panda, and most were 6, 12, or even 15+ years old. The penalisation did not (in some cases) match the quality, level, or sophistication of the sites in question; many were previously considered 'authorities'. Can anything be said about that update?
[edited by: hitchhiker at 11:33 pm (utc) on Jun 16, 2013]
The disavow tool has resulted in very few recoveries for webmasters, even when following the Webspam Team's advice on disavowing domains. It has proved especially ineffective against automated penalisation such as Penguin. Has disavow been implemented into the SERPs, and if so, why have there been so few recoveries?
|The disavow tool has resulted in very few recoveries for webmasters, even when following the Webspam Team's advice on disavowing domains. It has proved especially ineffective against automated penalisation such as Penguin. Has disavow been implemented into the SERPs, and if so, why have there been so few recoveries?|
+++ for this one.
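For anyone still experimenting with the tool despite the lack of recoveries: the disavow file itself is just a UTF-8 plain-text list, one entry per line, where a `domain:` prefix disavows a whole domain, a bare URL disavows a single page, and lines starting with `#` are comments (the domains below are placeholders):

```
# spammy directory that ignored our removal requests
domain:example-directory.com
# one specific bad page
http://spam.example.com/links.html
```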
In the past, algorithmic changes Google made generally did not affect websites at a sitewide level. When a change the Google team had determined was a quality win for the SERPs was pushed out, it would shift traffic on websites, but it generally did not kill off entire websites, whether they were great, mediocre, or spammy. With Panda, Penguin, and the lesser-known Ghost update, recent algorithmic changes seem to affect a website at a sitewide level. This appears to have been very successful against spammy websites in the short term (killing them off), and an overall quality win for Google. However, for websites that fall on the wrong side of a grey zone, it is devastating, and probably not Google's intention for those websites. Good websites in the grey zone are being killed along with the sites that are genuinely spammy. I don't see how that could possibly be good for Google, or the Internet in general.
Is it possible that communication could be opened up in Google's Webmaster Tools to help websites that are in the grey zone for Panda, Penguin, or Ghost?
Either provide them with some additional information to help the site get out of the grey zone, or, if you are afraid spammy websites could use this to their advantage, allow such sites to be reviewed by the Google spam team; if the team determines the site does meet Google's quality standards, give the owners real advice on what is putting them in that grey zone and how to get out of it. That, in my opinion, would be a quality win for both Google and the owners of good websites.
If Google wishes to reward unique and diverse content, why are so many local-service SERPs now dominated by not one but several directories?
Since many people use Google as (or instead of) a directory to find local services, why promote so many directories first?
If someone used Google to make a search and the first several results were links to OTHER search engines, would that be a good user experience?
An addition taken from this thread [webmasterworld.com...] (in the post by chrisv1963) that I would like to add here:
|Can't Google simply send a message in Webmaster Tools when there's a manual action against your website? It would avoid a lot of unnecessary reconsideration requests. |
In an attempt to move the thread away from noise.
We should understand that "quality", like "beauty", is in the eye of the beholder. So, it's likely that Google doesn't have hard guidelines on what exactly defines a quality site.
How many quality links should it take to offset unsolicited sub-par links?