| 6:57 pm on Jul 15, 2011 (gmt 0)|
My favorite quotes so far:
Link spam versus social spam:
"At SMX Munich Rand Fishkin heard from Stefan Weitz and Maile Ohye that it's a lot easier to recognize gaming of social signals than it is to recognize link spam."
"The social signals have more patterns and footprints around them. Also, the code that search engines use has gotten more sophisticated, and they have access to more data."
The Value of Inbound Links
"Another thing I hear people talking about is that over time Google is looking to supplant links with other signals. My take on this is that links are still going to be a good signal, but they are not going to be the only signal."
"Google has been saying that for years. I don't think the value of links will ever go away. They'll continue to be augmented with more data, which will make the value of links less important because there are many other signals now in the mix."
| 7:21 pm on Jul 15, 2011 (gmt 0)|
This section of the interview, to me, is very revealing...
|Eric Enge: Right. There is another element I want to get your reaction to which I refer to as the "sameness" factor. You may have a great user experience. You may have a solid set of articles that cover hundreds of different topics, and they may all in fact be original. However, it's the same hundred topics that are covered by a hundred other sites and the basic points are the same; even though it's original, there is nothing new. |
Vanessa Fox: Right. I think that's where added value comes into play. It's important to look and see what other sites are ranking for. What are you offering that is better than other sites? If you don't have anything new or valuable to say, then take a look at your current content game plan.
Eric Enge: So, saying the same thing in different words is not the goal. I like to illustrate this by having people imagine the searcher who goes to the search results, clicks on the first result and reads through it. They don't get what they want, so they go back to the search engine and click on a second result; it's a different article, but it makes the same points with different words.
They still didn't find what they want, so they go back to the search engine and click on the third result, and that doesn't say anything new either. For the search engine it is as bad as overt duplicate content.
Vanessa Fox: That's absolutely right.
Eric Enge: It may not be a duplicate content filter per se, which is a different conversation than this one, but the impact is the same. It's almost like an expansion of query deserves diversity, right?
Vanessa Fox: Right. These concepts have all been around for a long time, but we are seeing them perhaps played out with different sets of signals, but they are not anything new. The search engines have always said they want to show unique results, diverse results, valuable results, all these things.
The "sameness" between sites is growing... after all, there are only so many topics and only so many ways to cover them. For those operating in highly "duplicate competition" niches this can be the kiss of death... so perhaps fewer pages, with more topics per page, could be beneficial not only to the user but to the webmaster, too.
| 7:49 pm on Jul 15, 2011 (gmt 0)|
Good point, Tangor:
I remember Matt Cutts saying that what is needed is NEW information, new conclusions, new interviews with people.
I don't even know if it has to be accurate or truthful to rank. I keep telling myself that if I ever have any spare time to test things, I am going to write articles about how whale flatulence is the primary cause of global warming, and back it up with half-truths and made-up statistics, just to see if it will rank for "global warming causes."
| 7:51 pm on Jul 15, 2011 (gmt 0)|
|perhaps fewer pages, with more topics per page |
|What are you offering that is better than other sites? If you don't have anything new or valuable to say then take a look at your current content game plan. |
I'm not sure what Vanessa means by reviewing the content game plan. Definitely a review is in order as she recommends, and I don't mean to put words into her mouth but that review should include but also go beyond content. This is a very important part of entering a niche to begin with.
Speaking strictly of on-page issues, one of the steps I take before entering a niche is to review what the competition does, what they offer and make a list of what they do well and what they do less well. Then consider how you can turn their faults into what you do better. Clicking around a site gives you an idea of how it could fail a user and how you can turn that into a better offering delivered on your site.
| 8:42 pm on Jul 15, 2011 (gmt 0)|
|I'm not sure what Vanessa means by reviewing the content game plan. Definitely a review is in order as she recommends, and I don't mean to put words into her mouth but that review should include but also go beyond content. This is a very important part of entering a niche to begin with. |
which is what I've been saying all along!
| 8:52 pm on Jul 15, 2011 (gmt 0)|
Totally agree that content strategy isn't just about reviewing your current assets. (Blatant plug: my book is even about this idea. :)
About the first post in this thread, keep in mind, of course, that I haven't worked for Google for a long time, so while anything from Matt and Amit (and Maile, John, etc.) is definitive from Google, I'm just providing my opinion based on what I've seen and my past experience.
| 9:13 pm on Jul 15, 2011 (gmt 0)|
Vanessa Fox: "Panda isn't simply an algorithm update. It's a platform for new ways to understand the web and understand user experience."
In other words, Panda is a social algorithm independent of the main algorithm. To test that...
- actively increase your site's social engagement and measure the effect
- remove all traces of social features from your site and hope that Panda gives a pass to sites that don't push tweet and like buttons, and that those signals don't get weighed in
If you allow comments, you want lots of them. If you offer tweet and share buttons, they need to be used often. Stagnant features scream 'LOUSY EXPERIENCE', imo; it may be best not to have unused features at all.
| 9:28 pm on Jul 15, 2011 (gmt 0)|
I wouldn't say "in other words"... I don't know that Panda has anything to do with social engagement. (Maybe it does, but that's not what I was talking about.)
| 10:42 pm on Jul 15, 2011 (gmt 0)|
Heh - I can imagine a lot of people jumping on your comments, Vanessa, as the closest thing to a statement from Google.
I would like to quote a bit from that article that should be fundamental to anyone who owns or runs a website:
|Let's say the content and the user experience are good for that page. Then you run into the issue of quality ratio of the whole site. The question then becomes if someone lands on your site and they like that page, but they want to engage with your site further and click around your site, does the experience become degraded or does it continue to be a good experience? |
Think about your website - not just "landing pages".
If you are building sites or writing content and creating pages to capture traffic from Google then you will not make it in the long run.
I have been saying this in all my comments and advice post-Panda - you can't just "write good content" to get some visitors from Google, the game has changed fundamentally.
That is why I believe that all people have been doing is trying to "simulate" a good page or site, rather than sitting back and looking at what they are really offering and how they could create something special.
It is like the whole "SEO" thing is shifting back to classic marketing and thinking about customers (a USP, a brand, content for visitors not search engines) once again.
I like that, I really do - I don't like the collateral damage for e-commerce sites - but I have no sympathy for affiliate or adsense/ads based sites as I think they need to be really special to be even in the game.
| 10:56 pm on Jul 15, 2011 (gmt 0)|
So Google is now the Social Science Search Engine? Yeah. It figures!
| 11:03 pm on Jul 15, 2011 (gmt 0)|
I see little evidence that increased "social for the sake of it" equals increased visibility on Google, and by that I mean simply adding feeds to twitter or your facebook wall. Embracing social media properly results in good things all round.
Why have people latched on to that? Google themselves have said they don't have access to the Twitter "firehose" or Facebook likes.
"User engagement and behaviour" is what they have access to - via Google Search, GMail, Google Accounts, Adsense, Toolbar, Chrome, ISP data.
Nothing to do with "social" - more to do with the opposite, the "user".
| 11:24 pm on Jul 15, 2011 (gmt 0)|
I'm sorry, but with all due respect, I don't buy into this "social signal" thing. It might be a minor factor, but no more than that. The fact remains that much of the internet is not conducive to social signals. No one is going to "like" an article on how to treat gonorrhea. They are not going to "share" a post on Midget Tranny #*$!, and they are certainly not going to +1 it. One of the reasons the internet works is that one can be more or less anonymous.
So what happens when there are no social signals? Google needs to fall back on its standard algorithm. I also think this whole "brand" thing is foolish on Google's part. BP is a huge brand. Are you going to trust a damn thing they say about oil drilling safety?
| 11:27 pm on Jul 15, 2011 (gmt 0)|
|Nothing to do with "social" - more to do with the opposite, the "user". |
Don't cut that short... else why is there +1 and Google+
IT IS ALL ABOUT SOCIAL from here on out. That, and content, too. :)
| 12:15 am on Jul 16, 2011 (gmt 0)|
tangor, does Google+ have enough data this minute to create an algo or base rankings on it?
Will social be a massive factor in rankings going forward?
| 1:00 am on Jul 16, 2011 (gmt 0)|
Ummon, who said there is a "social signal thing"?
Who said that you need "likes" to rank an article about a disease?
Who said there is a "brand" thing?
At the moment I don't think your worries are an issue.
| 2:54 am on Jul 16, 2011 (gmt 0)|
No offense Swanson, I see your point as well, but I think they are, and I'll tell you why. Google is using something as a measuring stick for user experience, Vanessa is talking about Panda being "a platform for new ways to understand", and the CEO is all over the news praising Google+ to no end. The only way to gauge user experience is to study the user, and social gives you exactly that. Here is a likely scenario...
Google is watching Jane's online activities closely. Google has figured out that Jane enjoys her twitter account and publishes a blog in which she posts an average of 3 times a week. Her social 'rating' is excellent, she rarely posts duplicate content, gets good reviews and always credits the source. She's worth watching.
So what is Google watching? What Jane talks about! If they notice Jane searching for a surf board or toenail clippers and after visiting a site she wanders over to her blog and posts a link to the site she just visited there is a good chance that link is of high value, at least for Google's purposes.
A site that links to the exact same sites 50 other websites do, at roughly the same time, looks more like a paid advertisement, and if Google can't make the connection, no added value gets associated with the link.
Far-fetched? Not really, and Google+ will make it much more effective. That's why I said USE the social features to their fullest or consider not having unused features; it's better to send a great message than to show that you're not active, imo.
I'm wondering if it has reached further than that yet. Does webmaster 3299239232 always perform a site:example.com search before posting an article that always looks strangely like the one on example.com? If so, he's sourcing other sites for content and not as interesting to watch; Google already watches those channels.
This would also go a long way in explaining how a site can rank well with nothing but text on the page; not having something like a proper title or sitemap isn't going to cause a penalty, but having it wrong might look like intentional misleading, which might draw a penalty.
Anyway, all I know is that we are bumping around in the dark now, and less 'clutter' in the room (code, social signals, etc.) is a good thing.
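The "50 sites linking to the same place at roughly the same time" footprint described above is, at heart, a temporal-overlap check, and it can be sketched in a few lines. To be clear, this is purely a hypothetical illustration of the idea, not anything Google has documented; the function name, data shape, and thresholds are all invented:

```python
from datetime import datetime, timedelta

def looks_coordinated(sites, target_url, window=timedelta(hours=24), min_sites=50):
    """Return True if at least `min_sites` sites first linked to `target_url`
    within a single time `window` - the footprint of a coordinated link drop.

    `sites` maps a site name to a list of (linked_url, first_seen) tuples.
    The thresholds are illustrative guesses, not published signals.
    """
    # Collect the first-seen timestamps of every link pointing at the target.
    times = sorted(
        first_seen
        for links in sites.values()
        for linked_url, first_seen in links
        if linked_url == target_url
    )
    if len(times) < min_sites:
        return False
    # Slide a window over the sorted timestamps: if any `min_sites`
    # consecutive links fall inside one window, flag it as coordinated.
    for i in range(len(times) - min_sites + 1):
        if times[i + min_sites - 1] - times[i] <= window:
            return True
    return False
```

Links that trickle in organically over months would pass this check, while a batch of paid placements published the same afternoon would trip it, which is roughly the distinction the post is gesturing at.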
| 4:29 am on Jul 16, 2011 (gmt 0)|
So...is Vanessa speaking for Google, even indirectly, or is she giving her interpretation of what Panda is /might be? Looks like great advice regardless, especially to manage to make a living even without Google, but I'm curious.
|I have been saying this in all my comments and advice post-Panda - you can't just "write good content" to get some visitors from Google, the game has changed fundamentally. |
True, it has, at least so far. In fact, many inferior pages are ranking because the other site is considered "good."
Sitewide promotions and demotions do not end well when it comes to user experience. Let's see how Panda settles.
[edited by: walkman at 4:52 am (utc) on Jul 16, 2011]
| 4:35 am on Jul 16, 2011 (gmt 0)|
Vanessa is my Hero ;) Thanks for all this... a great interview!
| 6:22 am on Jul 16, 2011 (gmt 0)|
Anyone who actually bothered to analyse the SERPs and, especially, the sites that got promoted in Panda would see (in my niche at least) that these sites often have essentially no social presence at all. If Panda were driven by social, these sites wouldn't rank, never mind get a boost!
| 7:04 am on Jul 16, 2011 (gmt 0)|
Anyway, I bought Vanessa's book 3 days ago and it should be coming next week. It sucks to be a pawn in Google's game so I need to escape that.
Suggy is right; I've seen that in many cases, so there's no clear pattern. My 3 non-pandalized sites don't even have a twitter account, button, or anything. However, you should not assume that Google has gotten it right (that your site sucks that much), even though your site should and can be improved. Plus, what Google says is right today might be changed tomorrow.
Regarding social, I also have seen sites with tens of thousands of twitter followers get pandalized as well.
I think we need to wait and see Panda settle: some sites that did nothing are coming back, some of them are going down again, some changed and got worse traffic, scrapers are #1, etc. So we don't know yet how much Google can do correctly; we just know what they might want to do.
| 9:35 am on Jul 16, 2011 (gmt 0)|
|tangor, does Google+ have enough data this minute to create an algo or base rankings on it? |
What does "behavioral ads" mean? All the data that Google has acquired over the last 10 years. If that's not "social" then I don't know what is... and Google+ is just the next step; it's not "new".
Google's attempt to return the best serp for the user is all about social... i.e. the lowest common denominator for a query, as determined by all those who've made queries... and to make a buck off it, too.
| 5:36 pm on Jul 16, 2011 (gmt 0)|
Walkman, I'm not speaking for Google. I'm just providing my interpretation/opinion.
It does seem to me that what I found to be true when I worked in Google search and *did* speak for Google remains true: that Google is looking to rank the very best results for searchers and is constantly evolving its algorithms to better do that.
On the one hand, as I said in the interview, I think that Google is looking to use every valuable signal on hand (originally, all that was really available was links... now lots of things are available, including social signals... lots more signals will continue to be available as the web evolves). On the other hand, you can't really make a direct link of something like x number of shares/RTs/likes = y position rise in rankings.
As someone mentioned earlier, of course, lots of sites rank well that don't engage in anything social and what I see is simply more that:
1) Google wants to rank great content
2) People like to share and interact with great content
Those "shares" specifically don't cause the ranking. What Google uses to figure out great content is going to keep evolving over time. But content that gets linked to/shared/etc. a lot can help you know what content your audience is finding the most valuable (not to mention, those links and shares raise the visibility of your site to new audiences).
I've never understood those who look at things like links and social engagement for their SEO value alone. The whole point of ranking well in search engines is to get more customers. If you can get more customers from other methods, then it's great that those other methods *also* might help with SEO, but it's even greater that you're getting more customers!
| 6:05 pm on Jul 16, 2011 (gmt 0)|
I think the SEO community in general still suffers from the mind-set introduced in the 90s - by the early versions of "SEO software" such as Web Position Gold. The paradigm they introduced was one of a kind of checklist, where each item could grant a certain number of points to the page and the page with the highest number of points was supposed to be the winner.
To whatever degree ranking algorithms worked like that in the 90s, they certainly have a more complex and holistic model today. The words "complex decision tree" were not even mentioned in the 90s - but the phrase certainly comes up around Panda.
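The contrast between the old additive-checklist model and a more holistic, decision-tree-style model can be sketched in a few lines. The factor names, weights, and branch logic below are invented purely for illustration; no real ranking signals are being described:

```python
def checklist_score(page):
    """1990s-style model: every factor adds points independently,
    so a page can buy its way up one checklist item at a time."""
    score = 0
    score += 10 * page["keyword_in_title"]
    score += 5 * page["keyword_density_ok"]
    score += 20 * page["inbound_links"]
    return score

def decision_tree_score(page):
    """Holistic model: factors interact, and one quality failure
    changes how every other signal is interpreted."""
    if page["thin_content"]:
        # A single quality failure caps the score, no matter how
        # many links or keywords the page has accumulated.
        return 0
    if page["inbound_links"] > 100:
        return 90 if page["keyword_in_title"] else 70
    return 50 if page["keyword_density_ok"] else 30
```

The point of the contrast: in the additive model a thin page with 200 inbound links still scores highly, while in the tree model the same page scores zero because the branches are conditional rather than cumulative.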
| 7:38 am on Jul 17, 2011 (gmt 0)|
|1) Google wants to rank great content |
Then, frankly, it needs to try a lot harder. Panda doesn't seem to know great content from virtually no content, or a 404, or even a long-gone page (I'm amazed they don't find the number of dead links in the top 50 results more embarrassing)! Just search for a niche product and see what a... erm, rich... and varied array of 'quality' content Google spews out!
And, I'm not even talking about big brands ranking skinny pages; there's plenty of unloved, no-name, made-for-adsense sites offering two lines of 'quality' content and a giant adsense block front and centre and nothing else. Pure quality!
Seems to me that Panda works on the rule of exception. It's very efficient at ruling websites out for some minor or imagined 'quality issue'. It's not very good at checking what's actually left (junk) before filling the serps with it!
Maybe Google's egg heads should get their noses out of their maths books and try doing some actual searches from time to time. They might feel less self-congratulatory about what they've achieved!
| 9:09 am on Jul 17, 2011 (gmt 0)|
|1) Google wants to rank great content |
That's far from #1 for Google right now. While it may be true, I think they want to rank Places ahead of content in the natural serps, and they want to move searches to their social network via peer suggestions, etc.
Perhaps Google is planning to turn their serps into a paid directory (wait, page one mostly is already) and move their 'suggestions' to +1.
| 10:30 am on Jul 17, 2011 (gmt 0)|
|It does seem to me that what I found to be true when I worked in Google search and *did* speak for Google remains true: that Google is looking to rank the very best results for searchers and is constantly evolving its algorithms to better do that. |
I'd say that things have changed a bit, but that's another subject. Google has gone to "if slightly in doubt, remove 90% of traffic," and reconsider in 5-6-12-24 months. Frankly, they are either clueless about the pain they are causing, they enjoy it, or this is part of their ma$ter plan. It's not incompetence, since they could easily reverse part of it if they wanted to.
Now we're getting even unrelated Google Books results in many SERPs, as if other Google junk wasn't enough. It's easy to say "but your page/s still must suck," but at some point we just aren't going to buy it...
| 11:15 am on Jul 17, 2011 (gmt 0)|
"Google wants to rank great content"
but it seems to be using the actual content itself less and less to determine that. As has been noted, empty and non-content pages are ranking over substantial content, and scraped content is ranking over the original.
| 9:38 pm on Jul 17, 2011 (gmt 0)|
Vanessa Fox seems to have put together a very good analysis of the Panda update, as far as I can understand it; the best interpretation of it out of all the others.
I'm not summarising her, just expressing my own understanding of Panda, taking account of her views.
It seems that Panda is looking at the site level, not the page level. It also appears that Google wants to evaluate site relevancy based not only on links, but also on other signals.
It also seems that Google has put together an algo that assumes it understands which sites are the experts in any area. But at the same time it wants to provide SERPs which have diversity in them, and this for me is key.
In my business area there are four or five sites which are the "experts" in the UK. They are all swish, huge enterprises which throw enormous resources into their websites. Expert photography that has nothing to do with the end result but very clever all the same. Stuffed full of SEO experts because this is a hugely profitable industry.
I seem to co-exist beside them, and Vanessa Fox maybe provides the reason why: Google wants diversity in its SERPs. My site is clearly more amateur, and therefore a good element in the diversity of the results? If you are up against the big guys, maybe it is possible to compete by providing a different type of site, albeit an amateur one.
| 1:47 pm on Jul 18, 2011 (gmt 0)|
One thing I've been curious about, doesn't Panda prioritize Google's view of what the web "should" be like? Not everybody is searching for articles or blog posts, and penalizing sites that aren't in the business of providing large amounts of unique original information (such as e-commerce sites or gaming sites) seems a bit unfair.
With the dominance that Google holds over the search engine market right now, anything they do has a huge impact on the web ecosystem. Considering their tendency for major and often careless moves with little warning, though, I wonder how much they're really taking into account the way their algorithm changes will affect sites and webmasters. Calling Panda "a platform for new ways to understand the web and understand user experience" makes it seem like they're dangerously aware of the influence they hold.
More importantly, there's a fundamental misunderstanding in place when sites are told that the best way to reach the top of the search results is to just have a good site. The misunderstanding is that there's not really any exact, objective description of what a "good site" is - there are general guidelines, but the exact details will vary from person to person. A site that's good according to the perspective of GoogleBot and the search engineers may be slightly different from our own personal feelings on what a good site should be, which is why people are going to be picking apart Google's algorithms as long as there are algorithms to pick apart. It's about far more than just trying to game the system.