Was just looking at Matt's Twitter feed/conversation. Boy, he sure does take a beating with regard to negative comments.
Searched my sites for a "small" one to submit but I don't have any, maybe I'll build out a little .org or something just to submit.
There is no need to be logged in to Google to submit this form, so anyone can do it. I am wondering how they will weed through the mass of information they may get — some of it genuine opinions, some of it people reporting anything (e.g. small sites of various quality in non-competing niches) just to see what happens.
Or in short - can they really trust the data they collect?
I haven't done it purely because I think as a webmaster my opinion is completely skewed - of course I think my sites should do better. As does every other webmaster of a 'small' site (whatever that is) on the planet. The algorithmic penalties dished out to my two biggest in late July indicate that I've bigger problems than a 'should do better' mindset to worry about anyway, it seems.
If I had no shame, I'd tweet out to my sites' followers for them to fill it in but I'd rather not involve my loyal users in how badly things have gone lately with regards to the sites' ranks.
So that leaves me not filling it in. But not because I'm skeptical of Google's motives as a hell of a lot of people clearly are.
Big site or small site, why should it make a difference?
What Google needs to do is fix their mangled algorithm, because it's killing sites that deserve to rate better... sites that rated better 10 years ago, and rightfully so.
For example we have one site that was the first of its kind and it has the most related content and it has the oldest domain age. It also has more than 50,000 real backlinks.
But Google prefers to list newer, much less relevant sites ahead of ours — sites with one related page and a single title and H1 tag.
I cannot think of any site that deserves to be number one for its keywords more than this one.
[edited by: Robert_Charlton at 2:09 am (utc) on Aug 29, 2013]
[edit reason] removed off-topic, anti-google rant [/edit]
Here's the Google docs feedback form.... (two fields to fill in... the first one required)...
Small website survey
|Google would like to hear feedback about small but high-quality websites that could do better in our search results. To be clear, we're just collecting feedback at this point; for example, don't expect this survey to affect any site's ranking. |
What's a small website that you think should be ranking higher in Google?
Why does this website deserve to outrank the current websites in Google's search results? What makes this website better?
I think that the questions for the second form field are worth thinking about a lot before submitting the form.
IMO, Google is looking for heuristic (trial and error) type ways of further refining its quality evaluations, along the lines of its earlier example Panda seed questions, which we started discussing here back in May 2011 (and haven't stopped discussing since)....
Quality According to Google - Official "Guidance" on Panda Update
Here are some of the original questions, touching on qualities that might be at issue now....
- Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
- Does the article provide original content or information, original reporting, original research, or original analysis?
- Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don't get as much attention or care?
- Does the page provide substantial value when compared to other pages in search results?
- How much quality control is done on content?
- Does this article contain insightful analysis or interesting information that is beyond obvious?
I assume that Google wants to identify the kinds of factors it has perhaps missed over the past several years, or to find cases where it hasn't made correct distinctions. I don't think that Google is going to be looking at this feedback on a site-by-site basis, but it will be classifying sites and reasons and identifying patterns in the data, perhaps to re-examine reported sites on an algorithmic basis.
Claims of originality are also likely to enter into this. At the least, even if Google doesn't rerank sites because of this input, it may use the feedback to better explain what it's looking for.
Looks like another "hey guys, help us out." Well, I guess we have all done that in the past. Now... do you want some low-paid reviewer looking at your site?
Until GoogleGuy posts here, I'm opting out of everything. Feedback just seems to be one-way.
They have billions of dollars, how much do you have as a percentage of that ?
I've always had a problem with the Panda guidelines, as my sites don't really have 'articles'. They have visual things. Can't really say what, as it goes against the TOS of the forum, but essentially the content is in the visuals, and even marked up with all the microformat stuff, it will never, I guess, convince Panda of quality.
Instead of simply complaining that Google hates independent Web sites and is trying to boost "big brands," we have an opportunity to give examples of topnotch sites that may have fallen between the cracks of Panda, Penguin, or whatever. How is that not a good thing?
If Google really wants to help out main street and small business, what they need to do is turn the knobs, reduce the rankings of Amazon, Walmart, eBay, and the other corporate giants that dominate the top of the results, and let small businesses rise.
Google used to be for the little guy, lately it's all about the big corporations dominating the search results and the recent posts from senior members here reflect that.
trinorth - I think that's why they asked for feedback about small sites in particular.
Yes, this is an opportunity to provide examples of the good sites... not necessarily your own... that have dropped down in serps, and to explain your reasons.
I don't think you can simply grumble about the results without suggesting alternatives and giving reasons.
While we can't talk about specific sites here... I'm curious about what kinds of reasons members would give about preferring some of the smaller sites that have dropped at the expense of larger ones. What would you put into the second field on that feedback form?
Very simple: if I want to shop on Amazon, I go there and do my search. If I want to shop elsewhere, I would like to go to Google and not see Amazon results. Amazon may sell a lot of products, but small businesses specialize in custom products or variations of products Amazon does not offer. Currently, to find those results, I have to go to Bing these days.
I submitted one of mine; I have six sites that have the same architecture but different content related to their target location. Google loves the eff outa five of them, but the sixth can't get any traction.
And that's my reason for thinking it's an anomaly - G loves five, ignores one.
(I'm pretty sure I know why it happened, but it's past time to start showing up for searches)
Remember when Wikipedia dominated all the top results for the searches? These big corporation results are very similar and seeing people complain about them reminds me of that time. Maybe google will do something about them like they did with the wiki results.
Crowdsource the SERPs. It works for American Idol, why not Google?
|Remember when Wikipedia dominated results |
This was a typical example of manipulation from inside. Similar favouritism was applied to other "how do you" sites that only have a title and H1 tag related to the keyword string.
Dunno how Stack Overflow does it but I can be searching for a solution to a problem for days and then go to SO to post the question and it appears at the top of searches within minutes... with only a partial match.
Now the question is - what's the maximum number of visits for a site to still be considered a small website?
Two options -
* If you like being paranoid: Google needs work for its raters, so they need us to help them find more sites to screw.
* If you believe Google really wants to improve its search results: Submit your own site hoping that one day your feedback will help them do a better job.
I think this is a good move - at least Google is paying some attention to the issue of brand bias in the SERPs, which has been raised for some time.
The one thing Panda and other penalties have done in abundance is eliminate the resources, and often the confidence, that many small businesses need in order to reinvest in better UI/UX/content - I'm not sure this will address the business side of things. Google needs to become more responsive in rewarding the efforts of small webmasters trying to get back in the saddle, and more aware of how penalties affect small businesses in the real world - businesses that will not be able to remain sustainable when their primary channel for commerce is destroyed. Such a U-turn in tolerance is to some degree Google's shared responsibility, because it allowed the situation to occur and by default encouraged it (despite lip service and guidelines).
I'd encourage Google not only to look at why some sites may be overlooked in rankings, but also at how it can encourage small webmasters competing with the big boys to play on an even level if they make the effort, acknowledging how difficult that is to do without even-handed support. Set some better guidelines for quality, be more responsive in rewarding it, and allow sites to break through the brand bias. Somehow encourage them, perhaps through WMT. It's not just about penalizing sites, if the web is to have a chance of new, vibrant life en masse.
Closing thought ..... at least there are some efforts being communicated, and Google also deserves encouragement +1.
|I think that the questions for the second form field are worth thinking about a lot before submitting the form. |
Had the chance to think about this overnight and I think it's a pretty neat experiment. No longer can a website owner complain that their site should be doing better.
Here, in this simple little google docs form, is your chance to SELL your site to google, here is your chance to show google WHY your site should be #1...
I do hope they make the submissions they receive public, should be some fun reading!
|Martin Ice Web|
The question is:
Google has more data than we have; they know which websites lost rankings through Panda. It would be easy to pick 10,000 of them and check whether the Panda algorithm is wrong. Why should we submit our harmed sites?
The next thing is that this form indirectly declares that Panda is a penalty for the whole website; otherwise, they would have asked for the keywords for which the domain in question should rank higher than others.
There's a lot of information in this forum and other forums for Google to analyze. I'm sure someone with some authority to discuss future algorithm changes has seen all the complaints about stacked Amazon, eBay and other big branded listings in the serps. It's almost as if the algorithm gives you +5 points, and possibly host crowding benefits, if a stock symbol can be associated to your company.
This is a very, very hostile environment for small businesses in Google. I could spend weeks filling out Google's form explaining why some sites should rank above others. And I will spend an hour or two doing this, but I'm not holding my breath.
I've long believed that the reason why Google posts these feedback forms is to give disgruntled webmasters a chance to vent. After venting, the edge is taken off and some of these webmasters will relax a bit in hopes that Google will actually read what they submitted and act on it. In general, past history has shown me that soliciting feedback from webmasters in this fashion is more of a public relations move for Google than anything else.
Maybe instead of soliciting feedback from "small websites," Google employees should have first taken a good hard look at their search results. They might be amazed at the lack of small websites in many queries. But Google already knows this.
|Maybe instead of soliciting feedback from "small websites," Google employees should have first taken a good hard look at their search results. They might be amazed at the lack of small websites in many queries. But Google already knows this. |
100% agree! This just sounds like an admission that they don't know how to detect on-site quality. Sadly, this is obviously true in my niches; from the code up, they fail to detect well-coded, well-designed websites. I may post an example of a competitor's site that was great and has been replaced by utter DIY garbage! I'm too paranoid to post any of my own!
Or maybe they want some actual human type feedback on potential issues. Geezopete people, sometimes a cigar is just a cigar.
The good news is, it's entirely voluntary. You don't HAVE to submit your site. You can just keep on doing what you're doing.
I filled in the form twice. Once for a site/tool/webapp I like that ranks below pages that link to it. Once for my own site which I do not think gets credit for being written by a subject expert.
I would find it easier to list the big sites that should rank lower in the SERPs.
My least favourite at the moment is a big Q & A site that often dominates the SERPs (as in half the first page) - even when the page content consists of someone asking the very question I want an answer to. Google ought to be able to pick this up as thin content. Of course, a blog or a specialist forum that answers my question in detail gets pushed to the bottom of the page.
Matt could save himself an awful lot of work - and the hassle he's getting - if he simply eliminated brand bias.
In the majority of commercial SERPs I'm interested in, if the half dozen or so brands in the top places with their thin content were not there, the quality of those SERPs for a searcher would increase dramatically, because page 2 onwards is usually not bad at all - the problem is that hardly anyone goes there.
Grasp the nettle Matt, let brands compete on the quality of what they offer for the relevant search term just like the rest of the sites and you'd have a good, relevant search engine again.
Likely they will examine what they feel are true examples of small sites with merit that aren't ranking, and study why - what signals and patterns are being missed. It's basically quality assurance: identify points of failure and then fix them. This is the kind of thing that keeps Google ahead of the competition. Good move.
Robert Charlton's list of reasons of what gives a site merit should be printed up and tacked onto every Googler's wall.
It is almost as if things are turning full circle, with Google turning into Yahoo and asking for site recommendations. Perhaps Google is finally realising that the web isn't quite the collection of academic essays that the Panda "guidelines" seem to think it is. Then again, it might be a prelude to Pay Per Include.
I think across the board, I have an answer to "what makes this website better".
It's because it's a niche website covering a subject in detail. The experts are people who know something inside and out. Those will be most if not all the small websites imo. The specialty sites. Isn't being an expert in a field better than a website that has 1% of their articles covering the same thing? The site covering 100% of that subject must know it better and have more in depth information.
I do think this is a good thing that Google is trying to do. When I read "small websites", I can't help but think about all those specialty/niche sites which dropped off the face of Google search. Just my opinion, so who knows. It's what I've seen happen, but as we know, "your results may vary".
This is really awkward for Google. It makes me less confident that they know what they are doing.
I run an information site that consistently gets beaten by Wikipedia. It is particularly painful because the Wikipedia editors have often used the corresponding page on my site as the seed for their page. However, I have to say, in many cases, due to the additional information added, my site deserves to be beaten by Wikipedia (though it does bug me that my page was used as the authority source for a chunk of information on their page). However, there are many cases when the Wikipedia article is thinner than my corresponding page, yet Wikipedia still trumps me.
I think that is what webmasters have an issue with. Yes, it is reasonable for a thin page on a consistently fat site to rank well. However when a better page exists, that better page should get the extra edge. That is a harder problem to solve, though, because "better" can be too easily gamed on a page-by-page basis. It is safer for Google to return Wikipedia as #1 because their overall quality score trumps just about anyone else. I would imagine that the only way to beat it is via valid links because that is a strong indicator of quality - the people voting. Not always, but usually.
I don't really see this as "brand bias". It is more like "reputation bias". Amazon is a really good site for shopping. They have an impeccable reputation. If I had to guess which site would offer the best shopping experience for a widget, I would pick Amazon versus Widgets-R-Us.com, unless I had information that said otherwise.