
Google SEO News and Discussion Forum

Google Updates and SERP Changes - December 2013
Martin Ice Web

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4627107 posted 1:31 pm on Dec 1, 2013 (gmt 0)


System: The following 3 messages were cut out of thread at: http://www.webmasterworld.com/google/4620994.htm [webmasterworld.com] by aakk9999 - 3:08 am on Dec 2, 2013 (gmt 0)


@wilburforce, Google says the 404s are linked from a page that was deleted two years ago. The links on this page are not up to date. Google keeps these links in its crawl database even if you 410 them.
Very big problem here.
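For anyone wanting to send the 410 signal Martin mentions, here is a minimal sketch (assuming a small Flask app; the paths listed are hypothetical placeholders) of answering deliberately removed URLs with 410 Gone instead of a generic 404:

```python
# Minimal sketch: tell crawlers a URL was removed on purpose (410 Gone)
# rather than letting it fall through to a generic 404.
# The paths below are hypothetical placeholders; real pages would have
# their own routes above this catch-all.
from flask import Flask, abort

app = Flask(__name__)

REMOVED_PATHS = {"/old-category/old-page.html", "/discontinued-widget.html"}

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def catch_all(path):
    if "/" + path in REMOVED_PATHS:
        abort(410)   # 410 Gone: the page was removed intentionally
    abort(404)       # anything else unknown stays a plain 404
```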

 

EditorialGuy

WebmasterWorld Senior Member Top Contributors Of The Month



 
Msg#: 4627107 posted 7:46 pm on Dec 11, 2013 (gmt 0)

I have come to believe the 500+ updates per year - which works out to an average of roughly 1.4 per day - are more about obfuscating the algorithm (to SEOs, competitors, possibly even government regulators) than about delivering the best results.


It may also reflect the fact that Google is a data-driven company that relies heavily on testing. If every tweak to the algorithm is monitored, analyzed, and acted upon, the search results are likely to change. (Never mind the fact that Google uses geotargeting and personalization, including integration of G+ posts, news stories, YouTube videos, etc. into Universal Search--all of which are likely to make the results more fluid than they were in the days of "10 blue links" and monthly updates).

diberry

WebmasterWorld Senior Member



 
Msg#: 4627107 posted 8:41 pm on Dec 11, 2013 (gmt 0)

Yeah, but how do you make 500 sensible, carefully measured "updates to the algo" per year with no other consideration but making users happy? I'm suggesting that a minority of those updates are about users, and the rest are just shaking stuff up in ways that won't hurt revenue, but will confuse SEOs and anyone else seeking to decode the algo.

Because Google is data driven, it makes sense they want to keep the algo inscrutable not just to SEOs but also to competitors and government regulators. (No, I'm not suggesting they're doing anything illegal in the algo - merely that Congress has already tried to hold Google accountable for the algo delivering pirated sites when that's what searchers are after, so who knows what conclusions they'd reach if they actually understood what the algo was trying to do).

EditorialGuy

WebmasterWorld Senior Member Top Contributors Of The Month



 
Msg#: 4627107 posted 8:56 pm on Dec 11, 2013 (gmt 0)

I'm suggesting that a minority of those updates are about users, and the rest are just shaking stuff up in ways that won't hurt revenue, but will confuse SEOs and anyone else seeking to decode the algo.


That might make sense if Google felt a need to obfuscate, but is obfuscation really necessary with so many variables in the mix? I'd guess that the algorithm and its filters would remain inscrutable with or without random juggling, but still, your guess is as good as mine.

diberry

WebmasterWorld Senior Member



 
Msg#: 4627107 posted 9:08 pm on Dec 11, 2013 (gmt 0)

your guess is as good as mine


Agreed. For me, it's as much that I just can't fathom Google thinking the algo is such a mess that it needs 500+ updates a year to be good enough. Also, I don't see the SERPs improving in the way that roughly 1.4 updates per day would imply (I mean, Google is overflowing with brains - surely these updates all have a point).

So I ask myself what else it might be about, and yeah. Google's competitors are also extremely smart, and have already gone to the US government demanding antitrust action. They'll be back, and Google needs to be sure even THEY can't decipher the algo. Or at least I think that's how I'd play it in Google's place.

webcentric

WebmasterWorld Senior Member Top Contributors Of The Month



 
Msg#: 4627107 posted 9:18 pm on Dec 11, 2013 (gmt 0)

If I'm right, that means there just won't be any further deciphering of these algo updates - all 500 of 'em. When they released Penguin, they warned SEOs they would never manage to reverse engineer it. The best way to keep someone from reverse engineering something is to throw in statistical outliers ("false positives" or "false negatives") that cause the update to "not make sense" to humans.


That might make sense if Google felt a need to obfuscate, but is obfuscation really necessary with so many variables in the mix? I'd guess that the algorithm and its filters would remain inscrutable with or without random juggling, but still, your guess is as good as mine.


Everyone's guessing! Obfuscation through variables. What a concept. As long as we're guessing, my guess is that Google has an algorithm just for tweaking its own algorithm. That way only algo1 knows what algo2 is actually doing.

Who's to say Google doesn't know that I'm a webmaster and simply throws me a curve ball in the SERPs every time I search, whereas non-webmasters get a completely different set of results? Personalization could easily manage that. I run one search from the computer in my office and get one set of results. Go upstairs to my girlfriend's computer and it's a whole new ballgame. She's a shopper, I'm a webmaster. I agree with diberry: there is obfuscation built into the SERPs at some level, and it may actually be a pseudo-sentient persona in its own right by now. It's just a guess, mind you, but it's also an explanation that fits the current landscape.

If the concept of "obfuscation by design" is built into the system, then the future of SEO is truly a long shot - unless, perhaps, you're willing to forget everything you ever learned about SEO and start building for humans once again. Oh, and for those who haven't already done so, diversifying your game might be a good idea. At this point it can't hurt. If SEO is the subject here, then anything you can do to complement your SEO strategy and/or hedge your bet is at least worth mentioning. Today a sound SEO strategy needs a backup plan - unless, of course, you're the only one here who isn't guessing.
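To make the guess concrete (this is purely a toy illustration of the "obfuscation by design" speculation above, not anything Google has confirmed), a small per-user, per-query jitter on top of a deterministic score would be enough to reorder closely ranked results and frustrate reverse engineering:

```python
# Toy sketch of the "obfuscation by randomness" speculation.
# A deterministic base score plus a small jitter seeded by (user, query)
# means two people - or one webmaster on two machines - can see different
# orderings of closely scored pages, while averages stay stable.
import hashlib
import random

def base_score(features, weights):
    """The deterministic (hidden) part of the score."""
    return sum(weights.get(name, 0.0) * value for name, value in features.items())

def jittered_ranking(docs, weights, user_id, query, noise=0.05):
    """Rank docs with +/- `noise` jitter seeded by the user and query."""
    seed = hashlib.sha256(f"{user_id}:{query}".encode()).hexdigest()
    rng = random.Random(seed)
    scored = [(base_score(feats, weights) * (1 + rng.uniform(-noise, noise)), url)
              for url, feats in sorted(docs.items())]
    return [url for score, url in sorted(scored, reverse=True)]
```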

Wilburforce

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4627107 posted 9:33 pm on Dec 11, 2013 (gmt 0)

Who's to say Google doesn't know that I'm a webmaster and simply throws me a curve ball in the SERPs every time I search


Because you use proxies and try out searches similar to those in your server logs on more than one computer?

If you search while you are logged in to WMT or Analytics, Google does know you're a webmaster.

webcentric

WebmasterWorld Senior Member Top Contributors Of The Month



 
Msg#: 4627107 posted 9:49 pm on Dec 11, 2013 (gmt 0)

Because you use proxies and try out searches similar to those in your server logs on more than one computer?


Well, if I spent all day surfing through a proxy, that would be one thing. Without it, an IP address or a MAC address is all that's required. No need to be logged into any Google service - just having been there in the past would be enough. But I get your point about testing through a proxy. You know, though, that messing with variables could easily be accomplished with a random number generator. It doesn't even have to be targeted, and I think if you look at this board for the year, what people are seeing looks like it came out of a random number generator. Constant, unpredictable change. How do you optimize for that?

Added: I'll just add that it makes this thread look like a dog chasing its tail sometimes.

Martin Ice Web

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4627107 posted 7:56 am on Dec 12, 2013 (gmt 0)

Really big hit this morning; we lost about 50% of what is normal for this time of day.
A big shake-up with multiple listings from one domain again.
This all smells like an early Christmas present from the Panda bear - though certainly only for Google's earnings.

We have seen a lot of new bots since these updates began in November, including a new Googlebot/2.1 that we didn't notice before.

ecom, germany

Jez123

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 4627107 posted 8:48 am on Dec 12, 2013 (gmt 0)

Agreed. For me, it's as much that I just can't fathom Google thinking the algo is such a mess that it needs 500+ updates a year to be good enough.


It's sort of ironic that they try not to make any hands-on changes to the results, but now they have to manipulate the algos 500+ times a year.

viral



 
Msg#: 4627107 posted 9:58 am on Dec 12, 2013 (gmt 0)

Quite frankly, the winners out there, at least amongst my clients, are the ones that don't concentrate on Google.

I have startup clients that solely concentrate on driving Reddit traffic, yet Google ends up loving them.

This is going to sound kind of braindead, but they all have one thing in common: they all have huge non-Google traffic.

Want my tip for 2014? You don't need SEO, you need traffic. If 80-90% of your traffic is coming from Google, Google knows this! And as a result they don't like you, and probably will never go back to liking you until they have some compelling reason.

That reason is other sites and users liking you again and visiting without finding you through Google. Then, guess what, Google will bring you back into the fold.

Google is long past caring about only links and on-page SEO. Now they want to know you are popular in your niche in your own right.

Wow does that sound like a brand?

I have said before that I have one client ranking for some crazy competitive commercial terms, and they aren't even targeting them at all. All this site has is 800k unique visitors a day, on average, coming there to look at pictures.

Really, all this site has going for it is an overabundance of "non-Google" traffic. I keep the on-page SEO in check, but that is about it.

I hear people constantly complaining about Amazon here at WebmasterWorld. The complaints are usually about how bad the on-page SEO is on the landing page. I am here to tell you on-page SEO is virtually dead, and here is the one that will get me shouted out of the forum...

Quality of content doesn't matter one bit!

What Google wants to know is are you popular outside of Google. If you are popular, Google will forgive everything else you have going against you and rank you anyway.

How can you guys out there in small niches fight against this?

By doing other things to drive traffic.

Social is important! Many of you hate talking about it, but it is one of the only ways to get non-Google traffic to your site so that Google will eventually like you again.

We can all debate the ins and outs of Pandas and Penguins until the cows come home but that is just a slow spiral into oblivion.

Google only really needs 2 or 3 pages of results and in almost all niches it can do that "without your site being part of the mix".

Gone are the days of the Google free ride. Google would much rather push users at a knowledge graph than your site. Better bottom line for them!

Martin Ice Web

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4627107 posted 10:25 am on Dec 12, 2013 (gmt 0)

viral, I agree on many points, but this does not explain why so many silly and stupid pages are on page #1. You can't tell me that these pages are popular. Maybe black hats have found a way to control this.

Wilburforce

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4627107 posted 11:38 am on Dec 12, 2013 (gmt 0)

the winners out there at least amongst my clients are the ones that don't concentrate on Google


The only site in my sector that has remained consistently in the top couple of pages for key terms has clearly done nothing that could be counted as SEO.

From here, it looks like anything - from being linked to by a relevant forum to using <b>Key Term</b> (or h1, Title, you name it) - is now potentially damaging, and the sites that have survived are there by exception, not for any inherent merit.

Arguably (and certainly looking at the current SERPS), this is counter-productive for promoting quality: all you have left is sites created by people who haven't a clue or can't be bothered.

Constant, unpredictable change.


I am not sure it is unpredictable. Let us concentrate on what we know:

1. It looks as if there are multiple markers which (singly and/or in combination) can invoke a draconian penalty.
2. Historical data are probably involved, so you can be penalised today for a link that no longer exists, or for content that has been rewritten or removed.
3. There is a considerable delay between implementing changes and the result being recognised (let alone acted on) by Google. Of 184 links I removed on 9 October, 155 are still showing in WMT. Clearly the massive disavowal I carried out at the same time is proceeding at the same pace.
4. The fact that "updates" of one sort or another are being carried out with such frequency suggests increasing complexity of the algorithm(s) being used.

Prediction is made difficult by the complexity of the algorithm, by the delays in on- and off-site changes taking effect, and by the fact that multiple algorithm changes will take place between any site-alteration and its recognition. However, that does not in any sense at all make what is happening random.

EditorialGuy

WebmasterWorld Senior Member Top Contributors Of The Month



 
Msg#: 4627107 posted 3:35 pm on Dec 12, 2013 (gmt 0)

About the 500 algorithm tweaks each year and the "constant, unpredictable change":

Another reason could be replacement of the monthly "Google Dance" with "slipstream updates." Instead of bundling a bunch of tweaks together into a single update every month, the tweaks are rolled out when they're ready to be deployed and/or tested.

In other words, there may be nothing new about "500 updates a year." What's new is having them rolled out individually instead of being combined into a dozen monthly updates.

EditorialGuy

WebmasterWorld Senior Member Top Contributors Of The Month



 
Msg#: 4627107 posted 3:39 pm on Dec 12, 2013 (gmt 0)

this does not explain why so many silly and stupid pages are on page #1. You can't tell me that these pages are popular. Maybe black hats have found a way to control this.


I sometimes see "silly and stupid pages" on page 1 of the search results, but many (most?) of them appear to be there by dumb luck, or because of an exact-match domain that involves a location name (such as a city name), not because of any blackhat techniques. If black hats had created the pages or sites, they would have put more effort into monetizing them.

Martin Ice Web

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4627107 posted 4:35 pm on Dec 12, 2013 (gmt 0)

@EG,

I see these kinds of pages:

1. brands - mostly unrelated, just because they are brands
2. plain, content-less pages, with just the keyword appearing in the title or body text
3. sites with stuffed keywords - more than 10% of the main text
4. completely unrelated sites, because Google insists on showing similar keywords in the SERPs
5. sites with Amazon or eBay ads/affiliate links on them (not even their own content)

If this is compelling content, then the guys at Google might have smoked too many of their $$$ notes.

If Amazon ranks on page one, OK, it's big - although Amazon is nothing but a big affiliate/scraper site with too much money.

Meanwhile this algo change reminds me of the update in summer 2012. Same pattern, same odd results.



rowtc2

5+ Year Member



 
Msg#: 4627107 posted 5:22 pm on Dec 12, 2013 (gmt 0)

Ruined by Google updates on my main site, I have started 2 new sites in other niches, adding a few pages each day, the best quality I can.

The first site is doing well (it reached 100 visits/day) after getting 2 links - created by me.
The second site doesn't have a single visitor and has 0 links. How can I get links if people cannot see it?

Big brands have PR departments with people posting links every day on forums, social media, on all channels.

ohno



 
Msg#: 4627107 posted 5:37 pm on Dec 12, 2013 (gmt 0)

/\ We have links from forum users linking to our products, yet Google has still hammered us. I guess they are posting spam, eh Google? Today a ZERO sales day during the week; the rot just gets worse. It started with weekends, then they took evenings away, now we have nothing. At least the SERPs have moved, yet we are UP - makes no sense.

ETA: one theory is that people are ignoring organic results that sit near the ads. I did this myself the other day - I ignored the top result because I thought it was just another ad! I wonder if anyone else is doing the same?

Dymero



 
Msg#: 4627107 posted 7:34 pm on Dec 12, 2013 (gmt 0)

Yeah, but how do you make 500 sensible, carefully measured "updates to the algo" per year with no other consideration but making users happy?


There seems to be an assumption here (but I could be wrong) that all 500 updates are at the level of Panda, Penguin, and Hummingbird. However, as I understand it, at least in the beginning, Google would issue a big update, then take a look and notice that something wasn't quite "right" in a subset of results. So they'd go back and tackle that one area.

There is plenty of evidence that Google still does this, especially after they went after content farms a couple years ago, and I believe they tackled the payday loans SERPs earlier this year. So that's definitely included in the 500 updates, along with a huge amount of other small things that nobody really notices specifically but might contribute to the current state of everflux that we see.

goodroi

WebmasterWorld Administrator - goodroi is a WebmasterWorld Top Contributor of All Time, 10+ Year Member, Top Contributors Of The Month



 
Msg#: 4627107 posted 7:43 pm on Dec 12, 2013 (gmt 0)

Don't mind me, I am just going to actually talk about trends I see in the Google SERPs instead of spending my time complaining, which does me no good...

I still see some black hat techniques working but they tend to be short lived. One of the extreme cases I saw recently was a hacked restaurant website. It was ranking for some good money terms and took Google about 3 months to kill it.

The longer term ranking success I see is for people making their websites more locally relevant. Google seems to be trying to show more and more locally relevant serps even if you are not logged in. Some webmasters have built out landing pages for the different areas they service. These internal pages tend to rank better than their generic homepages. These landing pages are not cheap auto-generated pages with no original content. They have a decent amount of original text with relevant local terms & synonyms. I see these pages performing well.

I also see more and more correlation between sites gaining high usage (from Google or non-Google sources) and higher rankings.

taberstruths



 
Msg#: 4627107 posted 7:50 pm on Dec 12, 2013 (gmt 0)

Goodroi,

In your opinion, how is Google coming up with this website usage data? AdSense? Chrome? GA? Or some other method?

Martin Ice Web

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4627107 posted 10:14 pm on Dec 12, 2013 (gmt 0)

Deal or no Deal?

I was wondering why Google shows a summary page from eBay in Germany for every single query in my niche. Every time it is an ebay/phb/... URL. If these pages are so great, why doesn't Bing show them even once in the first 3 pages of a search? Is this a special deal with Google? My niche is full of these summary pages. They have replaced all the other eBay widget pages.

backdraft7

WebmasterWorld Senior Member



 
Msg#: 4627107 posted 12:05 am on Dec 13, 2013 (gmt 0)

Just an observation from my POV... we were experiencing a nice seasonal increase, but at the end of November something happened to slam on the brakes. Now this month we are trending down week to week, with hardly any traffic at all in the last few days - almost as if we've been hit by a penalty. Does anyone know of a very recent major algo update?

As a side note, we did switch hosting providers and change the IP address of the site two weeks ago. It almost seems like we're sandboxed. 13-year-old membership site.

goodroi

WebmasterWorld Administrator - goodroi is a WebmasterWorld Top Contributor of All Time, 10+ Year Member, Top Contributors Of The Month



 
Msg#: 4627107 posted 12:23 am on Dec 13, 2013 (gmt 0)

To be honest I am not 100% sure how or if Google is looking at usage signals. I just see a very strong correlation.

If I were Google and had a bunch of PhD employees and a big basket of money, I would not just look at one usage signal. I would look at multiple signals. If you honestly have traffic, it will show up in Chrome usage, ISP logs, type-in searches, etc. If you use just one signal, it would be possible for spammers to manipulate it, but using multiple signals can expose spammers who are manipulating one signal.

Even if Google is not looking at usage signals, I would still chase after them. Why? Because developing external traffic sources makes me independent of Google. By getting relevant websites to link to me, I develop more direct traffic, with the side benefits of increased link popularity and usage signals.
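As a rough sketch of that multi-signal idea (the signal names and the 10x threshold below are illustrative assumptions - goodroi is describing a correlation he sees, not a known Google implementation), cross-checking independent sources makes gaming any single one of them less useful:

```python
# Sketch: blend several independent usage signals, and distrust a site
# whose signals disagree wildly (a hint that one source is being gamed).
# Signal names and the 10x threshold are illustrative assumptions.
from statistics import mean

SIGNALS = ("chrome_visits", "type_in_visits", "isp_sampled_visits")

def usage_score(site: dict) -> float:
    values = [float(site.get(name, 0)) for name in SIGNALS]
    avg = mean(values)
    if avg == 0:
        return 0.0
    for i, value in enumerate(values):
        others = values[:i] + values[i + 1:]
        if value > 10 * mean(others):
            # One signal is far out of line with the rest: likely manipulated,
            # so fall back to the weakest signal instead of the average.
            return min(values)
    return avg

# Example: a site that only looks popular in one signal scores on its weakest one.
print(usage_score({"chrome_visits": 50_000, "type_in_visits": 20, "isp_sampled_visits": 35}))
```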

JD_Toims

WebmasterWorld Senior Member Top Contributors Of The Month



 
Msg#: 4627107 posted 12:35 am on Dec 13, 2013 (gmt 0)

To be honest I am not 100% sure how or if Google is looking at usage signals. I just see a very strong correlation.

Personally, I'd go with clicks and Chrome -- that's what I've heard/read anyway ;)

viral



 
Msg#: 4627107 posted 1:31 am on Dec 13, 2013 (gmt 0)

Google is definitely looking at usage signals. There are so many places for them to get this information: Chrome, the toolbar plugin, Analytics - even sites like Alexa.

Some people I talk to will say, yeah, but that information doesn't cover all sites out there. Not all sites have statistics gathered about them by these methods.

My answer to that is that Google doesn't care about sites it can't gather stats about. If you aren't showing up in the methods above, then guess what - you probably won't rank, and rightly so as far as Google is concerned.

We live in the age of crowdsourcing; why would Google not use the wealth of info it gathers through users to help determine ranking?

I have a client who refuses to use Analytics because he doesn't want Google or anyone else knowing his business. The argument I am constantly having with him is: you want traffic from Google, but you aren't willing to let them gather stats about your site? Google is slowly but surely sending him less traffic. I try to remind him that Google can fill up the first 3 pages of results with sites that are as good as his and also allow them to gather usage stats.

Having said all that, I don't think the algorithm is only about popularity. I think the popularity of your website does the heavy lifting in getting you a place at the front of the starting grid. After that it is things like on-page SEO, links and, yes, even content quality.

However, if your popularity/"usage statistics" are through the roof, you don't need those things - you will rank anyway. This is why Amazon ranks.

This is all just a long-winded way of saying that the brand signal has been turned up to 11.

Unfortunately, this is the world we live in. How do we become a brand in Google's eyes? By driving traffic from other sources, like I have already said.

And like I have said many times, the easiest way for webmasters to do this is by driving social traffic. This is harder in some niches than others, but in the end you are going to have to try and get some of this traffic if you want to continue to get Google traffic.

Dymero



 
Msg#: 4627107 posted 4:14 pm on Dec 13, 2013 (gmt 0)

I'm seeing a couple trends in my niche that may be minor but are interesting nonetheless.

First, the "recognizable brand retailer" site seems to be ever so slightly losing its grip on universal first or second positions, at least in some limited situations. Where this happens, Google seems to prefer the manufacturer site, even though these sites almost never sell the product themselves.

I'm also seeing an increasing number of news articles about the product interspersed with the commercial results in these SERPs. These are not news vertical results, as far as I can tell, but part of the main organic results.

Both of these things would seem to indicate a slight preference for, or testing of, a less commercial SERP. It's nowhere near a widespread thing, but it does provide something to think about.

Martin Ice Web

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4627107 posted 6:38 pm on Dec 13, 2013 (gmt 0)

@Dymero, I can't see this trend. In my niche, ecoms never seen before are pushed to the front: pages that need 10 seconds to load, pages with nearly no content on them, pages that seem to have been written in the late night hours.
Lots of domain crowding again. Whereas Google Shopping ads are on point - I have never seen Google Shopping ads being so good in my niche. Before this "quality" update for the organic SERPs, Google Shopping ads seemed to use some fuzzy logic, but now they show ads for exactly the widget you search for.
And while Google is big-brand biased, my screen shows me:
- 7 links from a big reseller (1 AdWords, 3 Shopping ads, 2 organic, 1 picture)
- 1 organic result from a different site.

We have never seen the organics this bad. And we have seen a lot of bad SERPs, but this update is the worst of all.

Meanwhile, one reseller that was not very good in the SERPs is on page 1 for nearly every query. The only thing they changed (the site is really scary) is that they "gained" over 12,000 Google+ followers in a really short time. This indicates that Google+ followers ->are<- a ranking factor.
As Google+ is almost non-existent in Germany and nobody knows it, I doubt that these followers are real.

In fact, it is time to buy some followers! Let's spend 50 bucks.


Wilburforce

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4627107 posted 8:19 pm on Dec 13, 2013 (gmt 0)

I'm also seeing an increasing number of news articles


@Dymero

That is interesting. In my sector there was a period (from late August through the first part of September) when news stories appeared for a couple of key terms.

Typically these were clustered - with six or seven news items appearing either in a solid block, or within a range of about ten results.

I haven't seen it since (and although a couple of people reported similar findings in other sectors at the time, I haven't personally seen it anywhere else).

What you describe sounds like a similar thing - they are probably trialling some News integration module - so keep us posted on this.

I can't see this trend


@Martin

If this is anything like what I saw, it is probably limited to a fairly narrow range of search terms/locations/sectors.

EditorialGuy

WebmasterWorld Senior Member Top Contributors Of The Month



 
Msg#: 4627107 posted 9:35 pm on Dec 13, 2013 (gmt 0)

I'm also seeing an increasing number of news articles about the product interspersed with the commercial results in these SERPs.


I've seen the same thing for some informational queries where the newspaper or magazine articles were barely relevant or were years out of date (e.g., an index of archived New York Times articles from the late 1980s for a travel search).

Maybe Google has cranked up the "authority" setting for major news sites in general Web search?

backdraft7

WebmasterWorld Senior Member



 
Msg#: 4627107 posted 9:51 pm on Dec 13, 2013 (gmt 0)

Wow - from gangbusters to zero traffic in the last two weeks. WTF happened?

backdraft7

WebmasterWorld Senior Member



 
Msg#: 4627107 posted 4:06 pm on Dec 15, 2013 (gmt 0)

Amazingly quiet here... dead quiet on the site too; suddenly 42% mobile and all are one hit and gone. Traffic continues to nosedive FNFR.
