|7 Ways Google Search Can Improve - for the common user|
I have been thinking a lot about Google recently, especially when I get frustrated because I cannot find the right results for my search query. I am a new father, so I have been searching often for father-related articles, and many of the results frustrate me enough that I end up digging deep into the pages of Google results. Here are 7 points I feel can really improve Google, speaking as someone who has done SEO and internet marketing for over 14 years and as an everyday searcher. I am sorry if this is not the appropriate place for this; feel free to move it if so.
1. Stop trying to get rid of paid links.
As long as links play a major role in the ranking of a website for a given term or terms, webmasters will buy links any way they can. This is a bad system because not all paid links are bad. For example, if I have a quality website, contact a popular blog to inform them of my website, ask them to do a review, and pay them for taking the time to do so, should this really be considered a bad link? If the blog owner feels blogging about the site would add value for their readers, accepts a payment for their time and the prompt completion of the review, and the review is accurate and offers something to their readers, why should this kind of thing be frowned upon? Google needs to focus on bad links, not all paid links, because when it comes down to it, most links are paid. The quality of a backlink should matter much more than how that link came to be. Which leads me to my next point.
2. Focus on devaluing bad links and link schemes.
Google has made great strides in recognizing spammy sites and removing them from the top results. However, spam sites still rank, made-for-AdSense sites still rank, and thin affiliate websites still find ways to sneak through. Google could better tackle this issue by targeting suspicious backlink profiles. For example, why should a made-for-AdSense site rank in the top 10 if 95%+ of its backlinks come from blog comment spam? Google needs to do a better job of recognizing this. Article submissions, directory submissions, and social bookmarking are all fine as long as they do not make up the majority of the backlink profile. Quality links come from a related website or webpage, within an article that is related to that website. Too often I see websites ranking high with no backlinks from any websites in their same industry. Why in the world would a golf website ever want to link to a lingerie website?
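To make the idea concrete, here is a toy sketch (not Google's actual system) of the two checks suggested above: flag a backlink profile when one low-quality link type dominates it, or when none of the linking sites share the target site's topic. The 95% threshold, link-type names, and topic labels are all invented for illustration.

```python
# Hypothetical backlink-profile sanity check, purely illustrative.
from collections import Counter

def profile_flags(backlinks, site_topic, spam_share=0.95):
    """backlinks: list of (link_type, source_topic) tuples.
    Returns a list of human-readable warnings about the profile."""
    flags = []
    # Count how often each link type appears in the profile.
    types = Counter(link_type for link_type, _ in backlinks)
    worst_type, worst_count = types.most_common(1)[0]
    if worst_type == "blog_comment" and worst_count / len(backlinks) >= spam_share:
        flags.append("dominated by blog-comment spam")
    # A golf site linking to a lingerie site: no topical overlap at all.
    if not any(topic == site_topic for _, topic in backlinks):
        flags.append("no links from the site's own industry")
    return flags

links = [("blog_comment", "golf")] * 19 + [("article", "golf")]
print(profile_flags(links, "lingerie"))
```

A real system would of course need far richer signals than a topic label, but even this crude filter catches the golf-links-to-lingerie case described above.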
3. Less brand power.
Let's face it: brands rank for anything. I don't mind big brands ranking high for their relevant keywords, but it has gotten to the point where they are starting to rank for terms they should not rank for. For example, if a major brand focuses on selling widgets and offers nothing for free, why should it rank #1 for "free widgets" and all free-widget-related terms? All because it has "sign up for free" on its homepage: Google sees a major brand plus the word "free" and decides it should rank for free widgets. This is a major flaw in Google's algorithm and should be corrected to get searchers to the most relevant website possible. If someone is looking for something that is free, there should be no shopping-cart or e-commerce websites in the results.
4. Reduce the weight that backlinks have.
Links are king. Give me the worst site with the worst content, and if it has the right backlinks, it will outrank everything. Google has a problem with webmasters manipulating its algorithm through paid links; one way to combat that is to put less weight on backlinks. Google is using the same kind of ranking system that search engines were using before Google even existed, and it's time we put more emphasis on quality rather than on the backlinks a website has. Which leads into my next point.
5. Make bounce rate a stronger ranking factor.
What better way to know if you're showing relevant results to searchers than to find out whether or not they return to the SERPs after clicking on a result? If 80%+ of searchers search for something, click the top result, then click back, then obviously that number 1 result is not the best match for that search term. Let visitors determine the most relevant results, not backlinks. There has been discussion that this can be manipulated, but if Google is able to tackle click fraud with AdWords, why can't it come up with a stable system to determine legitimate bounce rates? I am not saying bounce rate should dominate the ranking factors, but a weight of around 30% should be effective. No one factor should make up more than 50% of the ranking calculation, as backlinks do now.
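As a minimal sketch of the weighting scheme suggested above: combine several 0-to-1 signals, give the click-back signal roughly a 30% share, and refuse to let any single factor exceed a 50% cap. The signal names and exact weights are illustrative assumptions, not Google's real numbers.

```python
# Illustrative weighted ranking score with a cap on any single factor.
MAX_SINGLE_WEIGHT = 0.50  # the "no one factor over 50%" rule proposed above

def rank_score(signals, weights):
    """signals/weights: dicts keyed by signal name. Signals are 0..1,
    higher is better (e.g. click_back = share of searchers who did NOT
    return to the SERP). Raises if any one factor dominates the formula."""
    total = sum(weights.values())
    for name, w in weights.items():
        if w / total > MAX_SINGLE_WEIGHT:
            raise ValueError(f"{name} exceeds the {MAX_SINGLE_WEIGHT:.0%} cap")
    return sum(signals[name] * w for name, w in weights.items()) / total

weights = {"backlinks": 0.40, "click_back": 0.30, "content": 0.30}
# A page with strong backlinks but poor searcher satisfaction:
page = {"backlinks": 0.9, "click_back": 0.2, "content": 0.6}
print(round(rank_score(page, weights), 2))
```

Under these made-up weights, heavy backlinks can no longer carry a page that searchers consistently bounce away from.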
6. More manual reviews.
For highly trafficked search terms, Google should manually review the top 10-20 results to make sure they are the most relevant. This is especially important for seasonal trends, such as the holidays, to make sure popular seasonal search terms return the results the searcher wants. If a visitor searches for "buy christmas tree online", they should see websites that sell Christmas trees, not results filled with how to buy Christmas trees online or what a Christmas tree is (thank you, Wikipedia). If a popular search phrase shows searchers often digging into the deep pages of the results, such as pages 4, 5, 6, 7 and so on, a flag should let Google know this set of SERPs may need to be adjusted.
7. Don't be just another search engine.
Google is just another search engine. It just so happens to be the most popular one right now. If you do the same searches on Bing or another independent search engine, you will see very similar results. Google right now is just another search engine using the same base calculations that other search engines have been running for over 15 years. It is time it changed the game and made a drastic change; otherwise Google becomes the next Internet Explorer, and a Firefox will enter the industry, backed by tech-savvy engineers who can draw the interest of the younger upcoming generation of tech-savvy computer users.
Thanks for a very thoughtful post - great stuff.
With regard to "6. For highly trafficked search terms, Google should manually review the top 10-20 results to make sure they are returning the most relevant results," I'm pretty sure this is what is going on. However, the criteria given to the army of reviewers may need some tweaking, along the lines of incorrect brand dominance.
Thanks, tedster. I know they do a lot of manual reviews of websites, high-traffic keyphrases, etc., but I think they could be doing more. I understand that is a lot to ask, but for a company such as Google, with as many resources as they have, I think they can do more. Maybe I am too critical of Google, but when you own the market like they do, I think they should be more critical of themselves.
Realistically speaking, if you perform the same search on Bing that you do on Google, the top 20 results will be similar, usually the same sites just shuffled a little, and Bing will display several sites not listed on Google because Google slapped them with a penalty. So Google being the best search engine is not a fact; it is a matter of opinion, especially since its results are similar to the others'.
I remember when Google first hit the market. Back then I used AltaVista, Yahoo, Ask Jeeves, HotBot, and others. I had to use all of them because I could never find what I was looking for with regularity on one engine. Then Google came out with its simple design and relevant results, and soon enough it was the search engine to use; as time went on, it was the only search engine to use because its competition was crap. Well, that is no longer the case.

Google can release instant results, Caffeine, video results, Places, Instant Search, previews, etc., but people use Google for the most relevant results, not for all of these add-ons. If you ask me, all of these add-ons take away from what made Google popular to begin with: it was simple, direct, and to the point. You went to google.com knowing you could focus on your search, without waiting for a bunch of news stories to load (Yahoo) and without a bunch of features you never used. I am not interested in instant search, live previews, etc., and I think the average searcher feels the same. I know what I am looking for, so stop showing me results that are not relevant to my search. If I want to search for a video about widgets, I will go to YouTube or search for "videos about widgets"; don't show me widget videos if I am not asking for them. If I want to know how to buy something online, I will include the phrase "how to" in my search.
I am more aware of Google's flaws from watching my mother and father do a search. We internet-savvy consumers know how to get what we want, but my parents and many of my clients do not know how to properly search for something. I used to think this was their fault, but I am becoming more and more convinced that it is a problem with Google.
I am trying to speak as an internet searcher rather than a webmaster here. As a webmaster, I would not have Google change a thing, because I know how to be successful with Google's current system.
For clarity's sake in this discussion, how about we use the word "click-back" instead of "bounce rate" for the #5 point? That will keep Google Analytics out of the discussion, and I know that's not what you were talking about.
I wonder if any search engine dares to use click-backs as a very strong signal. It's common for me to click back on some searches because I'm comparing results and want to see more than one, not because the first result wasn't relevant. Comparison shoppers do that a lot, as do researchers.
From my point of view, which admittedly does not have access to all of Google's data, I agree that their long war on paid links has been ill-conceived and has done more to hurt than help. At the same time, I do think that Google is doing a lot to develop alternate signals, especially in the area of social media and the social graph generally.
As one member observed recently, it's beginning to look like WHO clicks on your links is becoming important, in addition to the raw number of visitors.
Yes you are correct with the click-back, that is exactly what I meant.
I appreciate Google's efforts to develop alternate signals for ranking a website, but that still does not diminish how large a role backlinks play in ranking. You can have all these signals that sound cool to talk about, but if the end result is not relevant results, what is the point?
You know that Bing commercial about search overload, where a bunch of people are spouting useless facts and information? I can completely relate to that commercial. I did a search for something last night, and Google showed me real-time results and latest news, which mostly featured available freelance positions that were totally not what I was looking for.
I wonder how much Google invests in search behavior: monitoring the average searcher who does not own any websites and just studying their patterns and behavior. I am sure Google employs behavior specialists, but I think this is something they should explore more. I know Google's Web History is supposed to adjust the results based on your previous searches, but right now it is not doing a great job. I would really like to see that improved.
It's also interesting to see a lot of people on here complaining about the results, but we are looking at it as webmasters, so there is a lot of bias. I wonder how Google goes about trying to understand the average search engine user and their patterns.
When you mention who clicks on your links, do you mean who clicks on your links from the SERPs, or who clicks on your backlinks?
Back-clicking would be a disastrous signal for many. We deal in contracts that always require three to five bids. Our users are always going to click back and check the others.
Eliminating any ranking value from non-content links would help with paid links. Non-directory, non-content links have little real value most of the time.
I agree with your other points.
I'd add that they should penalize those gaming the system with duplicate sites. Give them a few days' blacklist and they'll get the message. I'm tired of all the brandkeyword.com domains that are part of parent sites ranking. It's spam.
|do you mean who clicks on your links from the SERP's or who clicks on your backlinks? |
I think it could be both. Google and others are building rather large social graphs. They know who is an authority and who is an influencer. The idea is this: if one of these "important" people, identified by, for instance, their Twitter account or their Google Toolbar, visits a site, then that could be weighted as an extra signal.
I'm not saying this is true, mind you, but it is worth thinking about. I'm pretty sure that if an "important" Twitter account retweets your link, that means something. Social signals like this are under intense study. Heck, I'm even studying them ;)
[edited by: tedster at 10:30 pm (utc) on Dec 5, 2010]
scottsonline, I completely agree that it can be disastrous. I am not talking about making it a major signal; I am talking more along the lines of a system where, if over 90% of searchers click on a result and then end up on another result, that raises a flag to Google that this site is perhaps not suited for its placement, so Google can do a manual review to see how to "algorithmically" correct the problem, or maybe move the site down a few places to see if another site proves more helpful. I know it raises a lot of alarms and people would be very wary of supporting this, but I am talking about something very advanced, where it would not be as simple as a single back-click, and even then it would not be a major ranking factor; it would just bring to Google's attention some websites that should not be ranking as high as they are. A great example: if I am searching for widgets and I see a website about medicine, the click-backs would be very high, and this should send a signal to Google.
Google does have measures in place to stop duplicate sites, but like anything else, people find a way around them. I am actually dealing with this issue right now: a low-quality site has 3 other almost identical copies ranking in the top 20 for all related search terms.
|Make bounce rate a stronger ranking factor. |
I believe on-page factors should make up the majority of how Google ranks a page... So much so that I bet my entire SEO future on it... IMO, the only thing Google can base its rankings on is user experience, and the only way to see that is through what happens once a visitor lands on a page...
If they stay on the page and the site, to me that says a boatload... I bet wrong, though, because I am out, and sites that scraped my content into some form of dyslexic spinner software rank supreme. Go figure...
Google should knock links out of their algo, end the insanity, and monitor what truly affects user experience... I do not see how links have even the tiniest bit to do with their users. Just because a page has 1000 great related links simply does not make that page valuable... If 1000 people go to a page and never leave, odds are it is pretty valuable...
I just feel that I have been searching for things online in the same way for too long. There have been no major strides taken, in my opinion. Google has reduced some spam, but I have always been able to get through it anyway. I find that when searching for long-tail phrases, like a sentence with 10+ words, I can never find what I am looking for, which should be the opposite, since I am being very specific; but Google does not want to show exact matches for my sentences because it feels those websites are trying to game its system.
I have my own methods of searching to find what I want, and I am very good at it, because I have been doing it pretty much every day for the last 15 or so years. We are walking around with computers in our pockets; why are we still using search engines that are stuck in the 90's?
You know how we can tell that even Google knows it has a hard time displaying relevant results? Because it's still displaying over 20 pages of results, even for specific search terms. If I were a search engine, I would take it personally that my users have to dig through that many pages to get what they want.
Just wanted to thank you for the great post and follow-ups, along with tedster and his comments. I agree with a majority of your 7 points, and wish it currently was that way!
Time will tell.
Thanks, yellow_sun. Glad to know others agree with where I am coming from.
I can understand why Google is afraid to make a big change. If you are #1, why change anything? I wonder if anyone will ever be able to overtake Google as the #1 search engine.
|1. Stop trying to be rid of paid links. |
As long as links play a major role in the ranking of a website for a given term or terms...
The paper about PageRank is called "The PageRank Citation Ranking: Bringing Order to the Web." It's citation ranking, not link ranking. The next step is to think about the different kinds of citations there are. I think Google is way out ahead of webmasters again, this time with link-less citations [webmasterworld.com].
Excellent post Brinked.
Google should eliminate links as a ranking factor. As soon as links became "currency", the algo fell apart. You can't stop the abusers, so quit trying. There are better signals available. Let people buy all the links they want to drive traffic, but don't rank a site based on links. Who cares that site A "voted" for site B with a link five years ago?
I agree - measure user engagement with the site. User behavior is a much better indicator. You could hire people to visit your site, but that would be hard to maintain for any length of time.
I personally believe the web is at least three times larger than it would otherwise be because of bogus links. Way too many phony sites.
And yes, branding is way too strong. I don't need to see Amazon every time I search for a widget.
I don't think we should ever completely dismiss links. I think they are extremely useful: they tell Google about new sites, they let Google know who is talking about a site, etc. But it would be very beneficial to reduce the amount of power backlinks have. It should be less than 50%; right now I would say it's at around 80% or even higher. I think what tedster mentioned about "who clicks on your site" can be extremely useful. If Google knows who is a legitimate searcher and which sites they find useful, this can go a long way in telling Google which websites actually help the people searching for a given term.
|5. Make bounce rate a stronger ranking factor. |
I have just searched for "BMW 335i Price" on Google (UK results) and the BMW site appeared. I clicked, noticed the price (£36,340 for the SE) and clicked back.
That was a bad result, then? I got exactly what I asked for. I'd say it was a good result.
Bounce rate indicates very little, and I am sure Google is well aware of this.
PCInk, like I mentioned above, I would not make click-backs a major ranking factor. It would be useful in severe cases. For example, if a site ranks #1 for widgets and 95% of its clicks result in a click-back, while the 2nd result only gets a 20% click-back rate, this can raise a flag to Google to perhaps do a manual review.
This would be very useful for, let's say, a medical website ranking high for widgets because of a backlink containing the word "widgets". I am preaching against putting too much value in any one ranking factor, like what is being done with links. It would not be cut and dried; it would be part of a much more complicated system, where, for example, a site ranks #1 for widgets, has a lot of questionable backlinks, doesn't have much unique content, AND has a click-back rate over 95%.
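The "severe cases only" rule described above could be sketched as a simple flag that fires only when an extreme click-back rate AND other weak-quality signals agree. Everything here is a made-up illustration: the 90% threshold, the signal names, and the 0.3 content cutoff are all assumptions, not any engine's real criteria.

```python
# Hypothetical manual-review trigger combining several weak signals.
def needs_manual_review(click_back_rate, has_questionable_links,
                        unique_content_score, threshold=0.90):
    """click_back_rate: share of searchers who returned to the SERP.
    unique_content_score: 0..1, from some separate content analysis."""
    if click_back_rate < threshold:
        return False  # not a severe case; click-backs alone prove little
    # Only flag when corroborating quality signals are also weak.
    weak_signals = [has_questionable_links, unique_content_score < 0.3]
    return any(weak_signals)

# A medical site ranking for "widgets" via one anchor-text backlink:
print(needs_manual_review(0.95, True, 0.1))   # True
# A comparison-shopping page where users naturally click back:
print(needs_manual_review(0.95, False, 0.8))  # False
```

The second case shows why the corroboration matters: comparison shoppers click back constantly, so a high click-back rate by itself never triggers the flag.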
There should be a timing measure for click-backs. If you clicked a link, spent a few minutes reading, and then clicked back, that's a good result. If you clicked another link, spent 10 minutes reading, and then clicked back, that's a better result. If you clicked a link and never came back, that's the best.
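The timing idea above can be sketched as a tiny scoring function: weight each click by how long the searcher stayed before clicking back, and treat "never returned" as the strongest signal. The bucket boundaries and score values are arbitrary guesses for illustration only.

```python
# Illustrative dwell-time scoring for click-backs (invented thresholds).
def dwell_score(seconds_before_click_back):
    """Pass None when the searcher never returned to the SERP."""
    if seconds_before_click_back is None:
        return 1.0   # best: the result fully answered the query
    if seconds_before_click_back >= 600:   # 10+ minutes of reading
        return 0.8
    if seconds_before_click_back >= 120:   # a few minutes of reading
        return 0.5
    return 0.1       # an almost immediate click-back

clicks = [5, 200, 700, None]
print([dwell_score(s) for s in clicks])  # [0.1, 0.5, 0.8, 1.0]
```

In practice an engine would average such scores over many searchers before treating them as a signal; a single fast click-back (like PCInk checking a car price) proves nothing.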
|There should be a timing measure for click-backs. If you clicked a link, spent a few minutes reading and then clicked back, that's a good result. If you clicked another link and spent 10 minutes reading and then clicked back, that's a better result. If you clicked a link and never came back, that's the best. |
That is more like it. Like anything else Google uses as a factor, there is no clear-cut method of implementing it, because it's not as black and white as "result number 1 has a 90%+ click-back rate, so that site must not be of any use"; there are exceptions, as PCInk pointed out. Each searcher is different and each search is different. If we know what the user's intentions are, we can better gauge what they find useful and what they don't.
My main website, which gets about 60% of its traffic from search engines, has a 9.4% bounce rate from search engine traffic; all other traffic shows a 30-35% bounce rate. That means about 90% of my visitors from search engines found my site useful enough to at least take an action and interact. Shouldn't that count for something in Google's eyes?