
Google SEO News and Discussion Forum

This 207 message thread spans 7 pages; this is page 6.
Google to Target Overly SEOd Sites
graeme_p




msg:4429949
 3:03 pm on Mar 16, 2012 (gmt 0)

Matt Cutts says:

What about the people optimizing really hard and doing a lot of SEO? We don't normally pre-announce changes, but there is something we have been working on in the last few months and hope to release in the next few weeks or months. We are trying to level the playing field a bit. All those people doing, for lack of a better word, over-optimization or overly SEO'd sites - versus those making great content and a great site. We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchanging way too many links, or going well beyond what you would normally expect. We have several engineers on my team working on this right now.


Article here:

[seroundtable.com ]

Any guesses on what is likely to change?

 

wokka




msg:4431429
 6:29 pm on Mar 20, 2012 (gmt 0)

Ryan
Err... Is your complaint that running a successful business online is becoming more and more like running a successful business in the real world?


No - that's not what I'm complaining about. I live in the real world, and the business I run has been difficult from the day I started it.

What I am complaining about is the rubbish results being served at the moment, and the fact that you have to have very deep pockets (as deep as a massive multinational's) before you even think about advertising certain products above the fold.

Also, Google dishes up SEO advice and then says: oh, if you do a bit too much you'll be struck down. What's a bit too much SEO? Very subjective indeed.

wokka




msg:4431436
 6:33 pm on Mar 20, 2012 (gmt 0)

I just think Google is mixing everything up and trying to ruin the 'SEO business' model.

Can you imagine paying your SEO company a $10,000 monthly retainer and then tanking because they have SEO'd the website?

rlange




msg:4431480
 8:30 pm on Mar 20, 2012 (gmt 0)

wokka wrote:
I just think Google is mixing everything up and trying to ruin the 'SEO business' model.

What obligation does Google—or any search engine for that matter—have to not rock the SEO boat? Seriously.

I think certain people have forgotten what the "SE" in SEO stands for and that those two words are the entire reason the practice of SEO even exists.

Search engines are born, search engines create the need for SEO, SEO bludgeons search engines with claims of victimization. Brilliant. Has it really come to this?

The entire concept of SEO is built on the existence of search engines. You have to expect that even a non-malicious shifting of the beast will throw you off if you don't have a good grasp, and certainly abusing the beast will only make it move more violently.

--
Ryan

[edited by: rlange at 8:34 pm (utc) on Mar 20, 2012]

netmeg




msg:4431483
 8:34 pm on Mar 20, 2012 (gmt 0)

No.

rlange




msg:4431495
 8:55 pm on Mar 20, 2012 (gmt 0)

netmeg wrote:
No.

Yeah, that was overly dramatic. Honestly, I think wokka's the only one I've seen express concern that this is an attack on SEO. Most, however, seem to see it as an attack on webmasters (or, more specifically, webmasters of not-eBay/Amazon/Wikipedia).

I'm not sure which is worse, but neither claim seems to be supported by the short comment from which this thread sprung...

--
Ryan

reseller




msg:4431507
 9:18 pm on Mar 20, 2012 (gmt 0)

I think that if Google launches an anti-over-optimization improvement to its ranking algorithm, the change would impact around 0.5% of queries.

netmeg




msg:4431576
 2:30 am on Mar 21, 2012 (gmt 0)

Actually my "no" was in response to this:

Has it really come to this?

tedster




msg:4431580
 2:59 am on Mar 21, 2012 (gmt 0)

I also think we need to look at this from the other direction - a boost to sites that show signs of really good content but weaknesses in traditional SEO. It will be interesting to see how this shakes out - both immediately and in the long run.

It still seems to me that great content PLUS good technical SEO should top great content with lousy technical SEO.

Here are some random thoughts:

  • Maybe Google will start to use OCR more powerfully and treat text images more like text.
  • Maybe their Flash and Ajax handling has proven itself to a greater degree and is ready for prime time.
  • Same goes for URL discovery through forms.
  • Maybe their experiments in handling duplicate URLs are solid enough that (in some cases at least) things like non-canonical URLs and soft 404s will no longer be a big negative (there's a rough sketch of what I mean by a soft 404 at the end of this post).
  • Maybe their confidence in weeding out artificially created social (and other user-based) signals has grown.

    ...etc, etc, etc.

    And yes, maybe there will be some issues, especially when this first rolls out. I'm thinking of a bigger percentage impact than reseller is - more like 4% to 5% of queries. But it's all guesswork right now anyway.
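    For anyone who hasn't run into the term, a "soft 404" is a missing page that still answers with HTTP 200, usually showing a "not found" message on an otherwise normal template. Purely as an illustration of the behaviour (the probe URL, phrase list and thresholds are invented, and this is in no way how Google actually detects it), a crawler-side check might look something like:

```python
# Illustrative only: one crude way a crawler might flag "soft 404" behaviour,
# i.e. a site that returns HTTP 200 for pages that do not exist.
# The phrase list, timeout and probe URL are made up for this example.
import uuid
import requests

NOT_FOUND_PHRASES = ("page not found", "no longer available", "sorry, we couldn't find")

def looks_like_soft_404(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL answers 200 but reads like an error page."""
    response = requests.get(url, timeout=timeout)
    if response.status_code != 200:
        return False  # a real 404/410 is not a *soft* 404
    body = response.text.lower()
    return any(phrase in body for phrase in NOT_FOUND_PHRASES)

def returns_200_for_garbage(base_url: str, timeout: float = 10.0) -> bool:
    """Fetch a URL that cannot exist; a 200 answer suggests soft 404s site-wide."""
    probe = f"{base_url.rstrip('/')}/{uuid.uuid4().hex}"
    return requests.get(probe, timeout=timeout).status_code == 200

if __name__ == "__main__":
    # example.com is just a placeholder domain for the sketch.
    print(returns_200_for_garbage("https://example.com"))
```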

    Whitey




    msg:4431603
     5:20 am on Mar 21, 2012 (gmt 0)

    more like 4% to 5% of queries. But it's all guesswork right now anyway.

    That's potentially 17% of queries cumulatively, taken together with Panda. The problem for some is that it hits a lot harder in some vertical segments and on money-producing sites built for income, which tend to be more aggressive in their SEO.

    I'd ramp that estimate up a bit in those segments. Prepare for some noise and anguish, followed by a long period of analysis. Hopefully it will be easier to figure out, even though it may prove difficult for some to adjust to.

    graeme_p




    msg:4431623
     6:20 am on Mar 21, 2012 (gmt 0)

    Goo wants to streamline the web and give big players the spotlight while weeding out the noise and redundancy on the net.


    Why would Google want to do that? It is in their interest to maintain variety, because that is why we need search engines in the first place. If we always end up on Amazon/eBay/Wikipedia, why not go straight there and skip Google?

    Whitey




    msg:4431632
     7:10 am on Mar 21, 2012 (gmt 0)

    Why would Google want to do that?

    I don't know, but I recall Eric Schmidt, a while back, saying that Google would be announcing some adjustments in the SERPs that would create some softening of results and growth, in favour of longer-term gain. That may be connected to his other statement describing the "cesspool" of websites.

    I'm not sure of the overall strategy, but Google's ability to control content on their assets, or assets they dominate, like maps etc., is key, and "owning" listings in some form might be part of that goal. Locking in the "big guys" such as airlines and banks, and replacing the Yellow Pages, are the sorts of big plays that make the "little guys" irrelevant, I think.

    Search quality is just a by-product in planning the end-game mega-monetization, IMO.

    Google seems to want to provide an interactive brochure replacement, and I kind of picked this up in Amit Singhal's post [googlewebmastercentral.blogspot.com.au...]

    The thinking has to be reflective of part of an overall corporate vision of where search is headed from a business perspective.

    [edited by: Whitey at 7:27 am (utc) on Mar 21, 2012]

    wokka




    msg:4431633
     7:26 am on Mar 21, 2012 (gmt 0)

    [hobo-web.co.uk...]
    Shaun Anderson has been running a small test on whether the meta description tag has any relevance for a keyword search.

    It seems from his tests that it is relevant - maybe another over-optimisation signal?

    tedster




    msg:4431688
     11:58 am on Mar 21, 2012 (gmt 0)

    We don't normally publish links to other blogs and forums, except when the link points to information from official sources of some kind. However, we'll make an exception in this case because the meta description test is interesting and it seems to change the previous status of the element.

    I'm not clear how a meta description could be an over-optimization target. Maybe if it was stuffed to the gills or something like that? How very old skool that would be!

    BillyS




    msg:4431703
     1:17 pm on Mar 21, 2012 (gmt 0)

    The fact that the meta description appears in WMT tells me Google cares about this information. The title and description (not the keywords meta) seem important.

    SEOs stuffing the description deserve to be spanked.

    reseller




    msg:4431718
     2:09 pm on Mar 21, 2012 (gmt 0)

    So it seems we have reached the point in our discussion where we're talking about possible over-optimization signals.

    I would expect Matt Cutts and the other friends at Google's Search Quality Team to consider the following over-optimization signals:

    - Keyword stuffing in the <title> element

    - Keyword stuffing in the meta description

    - Keyword stuffing in the body text

    - Backlink schemes

    - etc....

    You are most welcome to add other possible over-optimization signals to the list above.
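    Purely as an illustration of the keyword-stuffing items on that list (not anything Google has confirmed, and with thresholds and the example page invented for the sketch), a naive check might look something like this:

```python
# Purely illustrative: a naive keyword-density check of the kind an
# "over-optimization" filter *might* approximate. Nothing here is a
# confirmed Google signal; the 15% threshold is made up for the example.
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that are the given keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

def looks_stuffed(title: str, meta_description: str, body: str,
                  keyword: str, threshold: float = 0.15) -> bool:
    """Flag a page if the same keyword dominates title, description and body."""
    return all(
        keyword_density(section, keyword) > threshold
        for section in (title, meta_description, body)
    )

if __name__ == "__main__":
    print(looks_stuffed(
        title="Blue widgets - cheap blue widgets - blue widgets sale",
        meta_description="Blue widgets, buy blue widgets, best blue widgets online",
        body="Blue widgets " * 50 + "are great.",
        keyword="widgets",
    ))  # True - every section is saturated with the same keyword
```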

    rlange




    msg:4431731
     3:04 pm on Mar 21, 2012 (gmt 0)

    Whitey wrote:
    That's potentially 17% cumulative of queries with Panda.

    I'm not sure it's useful to think of it this way, for two reasons: 1) Panda's been around for just over a year and clearly isn't going anywhere, so I think it's time to reset the baseline, and 2) it ignores overlap. I know you used the word "potentially", but even that logically leads to absurd statements like, "that's potentially 250% cumulative of queries with all previous Google updates".

    For sanity's sake, I think it's better to not start adding up the effects of any number of updates.
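    To make the overlap point concrete, here's a toy example with completely invented numbers: two updates that each touch a slice of queries can hit many of the same queries, so the real total is the size of the union, not the sum.

```python
# Toy numbers only: why adding update percentages together overstates the impact.
total_queries = 1000
panda_hit    = set(range(0, 120))    # pretend Panda affects 120 queries (12%)
over_opt_hit = set(range(80, 130))   # pretend the new update affects 50 (5%)

naive_sum = (len(panda_hit) + len(over_opt_hit)) / total_queries
actual    = len(panda_hit | over_opt_hit) / total_queries  # union, not sum

print(f"naive 'cumulative' figure: {naive_sum:.0%}")  # 17%
print(f"queries actually affected: {actual:.0%}")     # 13%
```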

    --
    Ryan

    sundaridevi




    msg:4431764
     4:52 pm on Mar 21, 2012 (gmt 0)

    I'm thinking more along the lines of traditional SEO (like the points that reseller lists above). So rather than thinking that Google will add remote voice recognition and automatically demote you when it hears the word SEO at your work station, I'm betting that it will just look at traditional SEO stuff and draw some lines.

    For example, a site with no link exchanges could be considered to have done less SEO than a site that has exchanged 1000 links (no matter how relevant), etc, etc, etc.

    reseller




    msg:4431796
     5:46 pm on Mar 21, 2012 (gmt 0)

    Here is some food for thought from Matt Cutts which might be very relevant to our current discussion!

    Matt Cutts said:

    Let me reiterate a point to the search engine optimizers (SEOs) out there: SEO is a field that changes over time, and the most successful SEOs embrace change and turn it into an opportunity. SEOs in 1999 didn't think about social media, but there's clearly a lot of interesting things going on in that space in 2010. [mattcutts.com...]

    BillyS




    msg:4431817
     6:29 pm on Mar 21, 2012 (gmt 0)

    Good point Reseller, I know a lot of webmasters have been hard at work paying for, or encouraging, "likes" and other social signals. Perhaps this might be added to your list.

    For example, Google may have been looking at this as a quality / engagement signal, but may now be tweaking that dial to eliminate some of the gaming we're seeing.

    BTW - I notice that a lot of the big brands jumped on that bandwagon too, offering what look like incentives to click that "Like" button.

    rlange




    msg:4431844
     7:20 pm on Mar 21, 2012 (gmt 0)

    BillyS wrote:
    Good point Reseller, I know a lot of webmasters have been hard at work paying for, or encouraging, "likes" and other social signals. Perhaps this might be added to your list.

    I don't think bringing attention to social media was the point of Mr. Cutts' comment. He's basically saying, "Don't get too attached to current SEO practices; they've changed in the past and they will continue to change." Social media was just his example of how things have changed.

    --
    Ryan

    tedster




    msg:4431956
     3:31 am on Mar 22, 2012 (gmt 0)

    If I were Google, another "over-SEO'd" signal I'd look for is a lot of recently expired domains being purchased and 301 redirected to the core site. Not lots of domains alone, but lots of EXPIRED domains.

    reseller




    msg:4432169
     4:55 pm on Mar 22, 2012 (gmt 0)

    I don't think SEOs who operate within Google's quality guidelines have to worry if Google launches an over-optimization algorithm update. Such an update would only affect spam sites. However, I think we should expect some collateral damage from it.

    claaarky




    msg:4432234
     7:10 pm on Mar 22, 2012 (gmt 0)

    To me, the only way Google could ever really "level the playing field" is by nullifying the effects of anything done for SEO purposes (easy to say I know, but maybe Google is almost there).

    Anyone can still buy their way to the top of Google if they have a good enough SEO company on board and deep enough pockets. Your SEO may be ethical but ultimately it's mostly about obtaining more top quality links than your competitors.

    Google has made the job of an SEO harder and harder over the years and that means it's more and more expensive to hire a good one. Small companies push themselves to their financial limits trying to keep up with the SEO effort of larger competitors and really it's a crazy game which will come to an end one day, maybe soon.

    I hope so. Until a few years ago we didn't need an SEO. These days it's our largest overhead and is prioritised above every other cost, including additional staff. We have to be able to compete but if it comes down to who has the deepest pockets then we can't.

    reseller




    msg:4432277
     9:09 pm on Mar 22, 2012 (gmt 0)

    I think a possible over-optimization update would roll out first in the USA and then worldwide several months later.

    Whitey




    msg:4432278
     9:12 pm on Mar 22, 2012 (gmt 0)

    I don't think SEOs who operate within Google's quality guidelines have to worry if Google launches an over-optimization algorithm update.

    Google's guidelines are intentionally abstract - so how can anyone be sure of the practical specifics?

    outland88




    msg:4432301
     10:10 pm on Mar 22, 2012 (gmt 0)

    I'm not so sure of that either. I've seen many web pages that exceed every degree of optimization and go un-penalized. In fact, Matt clouded the issue even more a while back when he pronounced that keyword stuffing was not necessarily justification for a penalty. I can agree to some degree, because I had made the same argument to him almost verbatim. But it did not include repeating the same elements over and over, or putting up those columns of hot phrases.

    IMO, manual over-optimization penalties were utilized by Google for a while, and then they lost interest, leaving quite a few wondering why the penalties were never lifted.

    [edited by: outland88 at 10:54 pm (utc) on Mar 22, 2012]

    defanjos




    msg:4432304
     10:19 pm on Mar 22, 2012 (gmt 0)

    when he pronounced keyword stuffing was not necessarily a justification for a penalty


    Because it's keyword stuffing in conjunction with a few other things that does it. When several elements are perfectly aligned, you'll trigger the over-optimization penalty.

    Don't just think on-page elements either.

    tedster




    msg:4432306
     10:22 pm on Mar 22, 2012 (gmt 0)

    They're not THAT abstract! If you are trying to push the edges, then you are looking for the kind of certainty that doesn't exist in life. And if you do that, then you should know what edges you pushed on when something blows up.

    In my experience on this forum, problems come up for people when they don't really read or pay attention to what Google says. Instead they read what other people say ABOUT what Google says. Then the mythology starts.

    People say Google generates FUD. That may be true at times, but it's nowhere near the amount of FUD that webmasters and SEOs turn out.

    Whitey




    msg:4432340
     11:27 pm on Mar 22, 2012 (gmt 0)

    True, but when you read something like [googlewebmastercentral.blogspot.com.au...] how is the average webmaster to know what it means?

    Each line item has a specific set of questions associated with it. Then there's the interpretation of the words.

    It's OK on the one hand to speak about pushing limits that webmasters should recognise, but on the other hand it's not easy to rely on interpretations from Google, let alone on webmasters who push interpretation beyond myth. Google could do a lot more to assist here and tighten up its guidelines into something more specific.

    I recall that many years back it took them two years to write a set of guidelines that answered the more common duplicate content issues. Should we be thankful and accept our fate, or should we be asking for clearer communication on specifics?

    Or is it that Amit Singhal has such a complex set of algorithm arrays that not even he is able to give specifics anymore?

    Therefore, IMO, not everything in writing can be relied on, and there are more edges than substance in some of Google's shifts. Not sure how this will play out in the upcoming over-optimization shift.

    scooterdude




    msg:4432346
     11:41 pm on Mar 22, 2012 (gmt 0)

    A question for y'all:

    What would happen if all traffic were evenly distributed between SEO'd and non-SEO'd sites?

    tedster




    msg:4432348
     11:44 pm on Mar 22, 2012 (gmt 0)

    I thought that was one of the best blog posts ever put out - extremely informative.

    My take on it - these are roughly the criteria that were given to human raters. Those raters then established the training sets that were used to develop Panda via machine-learning algorithms.

    Or is it that Amit Singhal has such a complex set of algorithm arrays that not even he is able to give specifics anymore?

    That's the way it is when a year's worth of intensive machine-learning and self-adjusting algorithms come into the picture. Google engineers have been talking about the eigenvalues of infinite Hermitian matrices and the like for many years.

    I'm sure Amit "could" get a good bit more specific if pressed, but it wouldn't be wise to show that many cards. And when it comes down to the real nitty gritty, he'd have to research the current code-base that's in use to answer anything in fine detail.

    All this is why I said in another thread that "checklist" or "punch list" SEO is no longer an effective mental model for the Google black box. If you try to understand SERP fluctuations that way, you'll be lost in mystery and probably become very angry after a while.
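    As a back-of-the-envelope illustration of that raters-to-training-set pattern (emphatically not Google's actual pipeline; the pages, labels and the choice of scikit-learn here are all assumptions for the example), the general shape is something like this:

```python
# Sketch of the general pattern only: humans label example pages, and a
# machine-learned model generalizes from those labels to unseen pages.
# Data, features and model choice are invented; this is not Panda.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical "rater" data: page text plus a human quality judgement.
pages = [
    "In-depth guide with original research, cited sources and an author bio.",
    "Buy cheap widgets cheap widgets best cheap widgets online now.",
    "Thoughtful product comparison written from hands-on testing.",
    "widgets widgets widgets click here free widgets widgets",
]
human_labels = ["high_quality", "low_quality", "high_quality", "low_quality"]

# Turn text into features and fit a simple classifier on the human labels.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(pages, human_labels)

# The trained model then scores pages the raters never saw.
print(model.predict(["Original tutorial with worked examples and references."]))
print(model.predict(["cheap widgets cheap widgets buy buy buy"]))
```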
