This 317 message thread spans 11 pages.
|Big brands do not have the upper hand - Matt Cutts|
| 8:41 pm on Mar 12, 2013 (gmt 0)|
|Big brands cannot do whatever they want. They look at value add, etc. Faster, better, better UI, content, etc. |
It is weird, Google does take action on big sites and big sites often do not like to talk about it. So it happens a lot. [seroundtable.com...]
Live blog interview with Matt Cutts.
How are members seeing those quality signals playing out in the SERPs compared to "smaller" brands?
| 1:23 am on Apr 24, 2013 (gmt 0)|
btw - on penalty brand bias inputs, more granular advice to a big brand site [ Mozilla ] that just got manually penalized:
|John [Mueller] added that in these cases, Google tries to go as "granular as possible with our manual actions." So in this case, Mozilla is not fully penalized, just the sections or pages that have this spam on it. [seroundtable.com...] |
... back to paid / organic brand symbiosis
| 9:15 am on Apr 25, 2013 (gmt 0)|
AdWords: There is a big barrier to entry for small businesses:
Complexity. One way to overcome this is to build a site for small businesses offering a specific local service. If 100 small businesses in different towns offering the same service band together, they can afford to engage an AdWords and SEO expert to make the site generate new customers. The difficulty is in selling such a concept to small entrepreneurs.
| 3:50 pm on Apr 25, 2013 (gmt 0)|
That's a great idea, instand1 - and not just for small businesses looking for local exposure. Many sites could probably benefit from forming co-ops and putting their ad budgets together for a shared return.
| 2:26 am on May 4, 2013 (gmt 0)|
Worth noting in the context of brand, from a long-established member? I thought so:
I'll put this out there for discussion.
On April 24 2012, my traffic tanked severely. I'm not sure if the forum rules allow me to post a link to the graph, but the "search impressions" line from Google Webmaster Tools was at around 600,000 from March 1 (the earliest I had data when I did the screenshot) until April 1. On April 1, the impressions got really weird. They cycled up and down between 600,000 and 1,000,000 per day. That didn't correspond with any external activity in my site's niche (it is a sport). Then on April 24, bam! Impressions dropped to 250,000. Again, that didn't correspond with any external activity in my site's niche. It was about a 30-40% drop from the previous year. My site has been around since 1998, so I know my traffic, and this clearly was a penalty. Exactly on April 24. Penguin? Sure seems like it. I wasn't totally devastated because my site has a firm reputation and many repeat visitors, but it impacted traffic by 20-50%.
As my sport went into hibernation for the summer, the impressions dropped even lower, down to around 100,000 to 150,000 per day. Traffic was down by almost 50% from the prior year, though part of this was because the top league of the sport that my site focuses on was on the verge of cancelling their season.
Then, on October 13, bam! The impressions skyrocketed to 450,000. It can be argued that October is the time of the year when my traffic normally picks up, but the increase was too sharp for this to be that kind of increase. It was clearly a recovery.
One thing I had noticed was a hallmark of my penalty was that pages where my site should have been #1 (because I'm the only page on the topic) were coming in at #11. Not all pages, but long-tail pages without external links to them. My sense is that pages with a good number of external links were able to overcome the penalty. The difference between the affected pages and the non-affected pages may also have been the amount of information on the page - I was never really able to figure out the reason for the penalty. Still, I felt the penalty was sitewide. On October 13, that phenomenon disappeared and my longtail pages were again appearing at the #1 position instead of #11.
All summer I had posted repeatedly in the Google forums and the (mostly rude) posters there mostly focused on my content even though the penalty was on April 24. Consequently, most of the changes I made were to my content, site architecture, and internal navigation. I did ask a few sites that had linked to me blog-roll style to remove their links, but I never felt like my backlinks were a problem because I have so many of them and they are all natural.
To be clear, I have never bought, sold, or traded links, so this was a false-positive hit, in my opinion. My primary theory on why I recovered is that my site had a non-typical backlink profile which falsely tripped Penguin, and that on October 13 Google rolled out a fix that corrected their error. My secondary theory (which I hope is not true) is that there is a traffic threshold level for Google's penalties, and that from April to October, due to the seasonality of my sport, my traffic dropped below the threshold and lost its ability to overcome a penalty which might still be lurking out there.
I installed Google Analytics on July 1, and have been monitoring various factors since then. One thing that puzzles me is my breakdown between Organic/Direct/Referral. These three numbers have remained relatively constant from July 1 to the present. When I was in the penalty phase, they were about 62% Organic, 24% direct, and 14% referral. When the penalty lifted, they were 66% Organic, 24% direct, and 10% referral. The only variation I see in these numbers is from weekday to weekend, when my direct traffic increases due to people viewing my site on their mobile devices (mobile traffic goes from 35% during the week to almost 50% on weekends). I was expecting to see a much larger shift in those numbers considering that Google was now referring 30-40% more traffic post-penalty than pre-penalty.
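The puzzle in the paragraph above can be checked with a bit of arithmetic. This is a minimal sketch, assuming organic traffic alone grew by 35% (an assumed midpoint of the reported 30-40%) while direct and referral stayed flat; the numbers are taken from the post, and the growth figure is an assumption, not a measured value:

```python
# Sanity check on the traffic-mix puzzle: if only organic traffic rose
# ~35% post-recovery, what organic share of total visits would we expect?

organic_share_before = 0.62                    # 62% organic during the penalty
other_share_before = 1 - organic_share_before  # direct + referral = 38%
organic_growth = 1.35                          # assumed ~35% rise in organic visits

# Recompute the mix after only organic grows.
new_organic = organic_share_before * organic_growth
expected_organic_share = new_organic / (new_organic + other_share_before)

print(f"expected organic share: {expected_organic_share:.1%}")  # ~68.8%
```

The expected share comes out around 69%, but the observed post-recovery share was only 66%, which would suggest direct and referral traffic also grew somewhat, not just organic. That may explain why the poster saw less of a shift than anticipated.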
TLDR: I wanted to put that out there - I was undoubtedly penalized on Penguin day, and my traffic undoubtedly returned on October 13.
1 year anniversary of penguin, no recovery [webmasterworld.com...]
I'm flagging this post as a possible identification of branding playing a role in a recovery, or insulation from a penalty. This isn't a major brand, I assume, but it may signal a threshold in the algorithm that responded to the pick up of "brand activity", which in conjunction with some tidy up work, pulled the site back.
There are some juicy inputs that follow from members, which kind of support some of the hunches in this thread.
|Martin Ice Web|
| 2:17 pm on Jun 1, 2013 (gmt 0)|
I posted it in "Googler Matt Cutts Quote about internal website links will not cause you any sort of trouble" but I thought it was worth digging a bit deeper into this one.
A big competitor went through all the Panda/Penguin updates without any trouble and gained the #1-#2 spots for every query in our niche.
He links from every widget page to related and unrelated pages on his ecommerce shop. All followed links, all dynamic, changing with every reload.
The site is good from a user's point of view, although it is confusing because it's not sorted the way a user would expect.
The shop uses categories for the widgets but presents each category of widgets as a SERP-style list. The user then has to pick options in the left-side navigation to refine the search. Now, what astonished me is that all these result pages have the same title! For a category with about 800 widgets, you have about 80 unfiltered pages of widgets with the same title. Adding all the refined pages with the same title, you get to around 200 pages with the same title. All pages have robots follow, index.
Now, the best part is that all pages can be viewed in gallery or list view. The only thing is that the canonical tag on each page links to the page itself.
Google has all the pages in the index!
How could that be, if there is no bonus for brands?
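The self-referencing canonical setup described above is what lets every filtered and paginated variant into the index. A common deduplication approach is to point every variant at the base category URL instead. Here is a minimal sketch; the URLs and the `canonical_for` helper are hypothetical, for illustration only:

```python
# Sketch: each filtered/paginated variant currently declares itself as
# canonical, so every variant is a distinct indexable page. Pointing all
# variants at the base category URL consolidates them into one.
# (shop.example URLs and this helper are made up for illustration.)

from urllib.parse import urlsplit, urlunsplit

def canonical_for(url: str) -> str:
    """Strip filter/pagination query params so every variant of a
    category page shares one canonical URL."""
    parts = urlsplit(url)
    # Drop the query string entirely; keep scheme, host, and path.
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

variants = [
    "https://shop.example/widgets?page=7",
    "https://shop.example/widgets?color=red&view=gallery",
    "https://shop.example/widgets?view=list&page=3",
]

for url in variants:
    print(f'<link rel="canonical" href="{canonical_for(url)}">')
# Every variant now emits the same canonical:
#   <link rel="canonical" href="https://shop.example/widgets">
```

In a real shop you would keep genuinely distinct pages (e.g. different categories) self-canonical and only collapse the view/sort/pagination variants, but the sketch shows why 200 same-title pages with self-referencing canonicals all end up indexed.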
| 8:58 pm on Jun 3, 2013 (gmt 0)|
All I'm going to say about this is:
Amazon uses Black Hat SEO and spam techniques to get links to their pages.
We've all seen it. It's no secret, yet their website still appears at the top of the SERPs with very thin pages.
| 9:42 pm on Jun 3, 2013 (gmt 0)|
All I'm going to say about this is: :)
When you have the time and money to dress your site for Google, you're going to have a good day.
(We should dress for our audience, not for Google - no longer realistically possible for some)
| 6:03 am on Jun 12, 2013 (gmt 0)|
Danny Sullivan interviews Matt Cutts @ SMX
|DS: Why is Panda large-brand focused? |
MC: It's not large-brand focused.
DS: Why not?
MC: We look at all the data we have. We don't target brands.
|DS: Does Google have different ranking factors for different industries? |
MC: We have looked at topic-specific ranking. The problem is it's not scalable. There's a limited amount of that stuff going on; you might have a very spammy area, where you say, do some different scoring.
What we're doing better is figuring out who the authorities are in a given category, like health. If we can figure that out, those sites can rank higher.
DS: How many different categories?
Ouch. Do we have a conflict here that needs clarification?
Does Google take commercially stronger businesses and elevate them as brands, even if the same or superior user experiences are offered by others, even affiliates?
|DS: Earlier in this session, Matt mentioned affiliates and black hats in same sentence. Does Google view affiliates as spammers? |
MC: I regretted saying that as soon as it came out. There are a lot of good affiliates that add value. But by volume, we tend to see more affiliates that are not adding value. Hipmunk is an example of a site that adds value as an affiliate.
| 10:02 am on Jun 12, 2013 (gmt 0)|
What does authority mean these days? It seems to be the sites that most people choose to use off their own backs. For example, a shopping search would see most users going to Amazon. However, is Amazon an authority for the product? Does it have the best information or unique content for the product? It seems to me the meaning of authority has changed from in-depth product information to most likely to sell that product. In the old days the algo searched for authority based on informational parameters; these days it identifies authority based on user patterns.
| 11:27 am on Jun 12, 2013 (gmt 0)|
|what does authority mean these days? |
|....is figuring out who the authorities are in a given category, like health. If we can figure that out |
@santapaws - well, they haven't figured it out, so until they do we won't know, and there'll be criticisms, I guess, of some of the default branding listings we constantly see. Much more work is required here IMO in specific verticals. Makes me wonder if there's a big manual review component involved.
| 11:38 am on Jun 12, 2013 (gmt 0)|
|What does authority mean these days? It seems to be the sites that most people choose to use off their own backs. For example, a shopping search would see most users going to Amazon. However, is Amazon an authority for the product? Does it have the best information or unique content for the product? It seems to me the meaning of authority has changed from in-depth product information to most likely to sell that product. In the old days the algo searched for authority based on informational parameters; these days it identifies authority based on user patterns. |
Google's version of authority seems to equate to size - the bigger you are, the more authority you have. I'm sure people here will show counterexamples, and I know there are some that exist - but that's the general trend I see. If you're big, you're an authority. Pinterest is a great example of this. They're big and popular, therefore you see them ranking well for all kinds of keywords.
| 11:42 am on Jun 12, 2013 (gmt 0)|
@ColourOfSpring - You acknowledged this, but let me stress it: that doesn't match what a lot of us big/popular sites are seeing. Though I can only speak from the limited perspective of myself and another 20 or so big sites.
If by 'big' and 'popular' you mean lots of pages and lots of visitors: 1 million pages / 120k visits a day.
| 11:51 am on Jun 12, 2013 (gmt 0)|
... hmmm, I suspect there is a manual organisation component. If they don't know what authority is, and they clearly don't because they say so, how can they let the algorithm loose completely?
They must have asked their regional editors in the key markets Matt is referring to, against an editorial template, to nominate the top "x" sites in key verticals and then weight those domains. It also makes sense strategically, while Google "organises" smaller sites through the transition [ if they haven't gone bust already ].
Look, of course I don't know, but now Google is on record as saying "they are figuring it out". That's a pretty open admission.
| 11:57 am on Jun 12, 2013 (gmt 0)|
|That's a pretty open admission. |
Yep, it really is. I wish they'd talk with us more - we did see this coming way back. IMHO: they had already lost track of the signals the day the 'disavow' tool reared its ridiculous head.
Overall, if you create this much chaos, only people with deep pockets will survive it. I think they know that now.
| 12:04 pm on Jun 12, 2013 (gmt 0)|
Google's algorithm is so complex, I don't think even Matt Cutts can fully grasp what it is doing. I don't fault Matt for this.
It's terrible to say, but in an effort to create artificial intelligence, the overlapping layers of scoring can create an end result that looks like someone threw darts at a board to determine who ranks where.
| 12:10 pm on Jun 12, 2013 (gmt 0)|
|if you create this much chaos, only people with deep pockets will survive it |
Well I absolutely applaud the communication in the last couple of weeks and breakout at SMX. The feedback is unprecedented.
But the flip side is that small/medium sites will struggle to organise their limited resources to decide what to do, especially with many suffering serious revenue depletion. More refinement of those communications, with tools, is needed to make it a level playing field between brands and aspiring small/medium sites.
On the Google side of the equation, Matt has spoken about producing results for users, and how all sites need to focus on things like UI and different assets to gain an edge in ranking and keep users returning over the long term. Google also faces rapid acceleration in device usage, e.g. smartphones, and massive consolidation and resource growth in key verticals. Matt emphasises in a video that Google itself took a financial hit to get things right for the future, so they are not exempt either.
No wonder all the SEO's were clapping hands at SMX. They have a story to tell their clients - spend money to adapt to the new order. It's enough to make them happy.
Google needs to communicate its requirements more openly for ANY business to cope with the changes coming, large or small. Perhaps they have recognised this recently; I don't know.
[edited by: Whitey at 12:48 pm (utc) on Jun 12, 2013]
| 12:18 pm on Jun 12, 2013 (gmt 0)|
What I see in the travel sector is a preference for sites with some level of editorial control. That is, UGC sites such as the infamous TripAdvisor rank well because, presumably, user input corrects errors. The other big players have major offline presences, whether in the form of guidebooks or newspapers, or they are official sites that some government or local body stands behind.