Mod's note: I changed the title of this thread, when publishing it, from "Great sites don't necessarily rank say's Google" to "Search engines need time & other signals to confirm a site is 'fantastic'". This is less sensational, but IMO, more accurate to JohnMu's comments.
|What does this say about folks working hard to getting out of Panda recovery recognition? Does this resonate with anyone's experience? |
I'm not sure what the thrust of this question is... whether this is about a Google deficiency in ranking sites or a truth of SEO that we've known for years. I don't think it should be news that it takes time to build good links, and for Google to evaluate them.
Also, Panda isn't about backlink spam... that's Penguin.
To put some things back in context... the SER article that discusses part of the JohnMu quote is here....
Google: Fantastic Web Sites Not Enough To Rank Well
Search Engine Roundtable
Jun 20, 2012
|...John is answering a specific question so taking him within that context is important, which I am not doing. |
But clearly he is saying that having a great web site is one thing. Google still needs to verify other factors, off page factors, to validate the site is truly fantastic to others.
I myself feel that Barry's context becomes more valuable if we also look at the JohnMu quote in its context. Since links to Google's forums have a habit of breaking, with or without the WebmasterWorld link redirect script, here's the thread on which John is making his comment... note the title... along with a url that may work for you, and the bulk of John's comment.
Competitor might be doing a bad SEO attack against us. Strong traffic drop
|From what I can tell, your site is still fairly new - with most of the content just a few months old, is that correct? In cases like that, it can take a bit of time for search engines to catch up with your content, and to learn to treat it appropriately. It's one thing to have a fantastic website, but search engines generally need a bit more to be able to confirm that, and to rank your site - your content - appropriately. |
That said, if you're engaging in techniques like comment spam, forum profile link-dropping, dropping links in unrelated articles, or just placing it on random websites, then those would be things I'd strongly recommend stopping and cleaning up if you can.
I think the first paragraph of John's comment says what it says, quite clearly: it will take Google some time to sort out the quality of a site based on linking signals. It likely also means that it takes time for natural backlinks to build. In the past, it's been generally agreed among good SEOs that it might take at least a year or two to build a critical mass of really good-quality backlinks.
Additionally, as I read between the lines, John is commenting, in the context of the Google forum thread, about whether it was negative SEO or questionable link building tactics that caused the weak rankings. That's being debated on the Google thread, as it had previously been debated here. IMO, John's saying that he's seeing a bad link profile for the OP's site... and that if the site is actually good, it's going to take better quality linking signals to make it rank. That may or may not have implications about whether negative SEO can hurt a new site with an emerging link profile (as opposed to a bad link profile).
Regarding the original post in this thread... whatever one wants to make of the fact that good backlinks take time both to get built and to get sorted out (and that's getting more complicated), I'm not sure there's anything new here about Panda. I don't think Panda is about backlinks.
If [Whitey's] original post is about time and patience... IMO it's a pretty obscure way of making the point.
Edit note: Fixed Google forum url, and clarified which "original post" I'm talking about. I have no trouble with JohnMu's comment.
[edited by: Robert_Charlton at 6:56 pm (utc) on Jun 21, 2012]
Search engines need to establish a baseline for how often a site is updated and how deep the content (clicks away from the home page) is on the site. From experience (as in, this is exactly the problem I am working on for a country-level search engine), this can take up to a year of continual, scheduled spidering. A new site has a very specific link profile and link acquisition rate. The process of a new site gaining new inbound links is, where there is no outside SEO campaign involved, more like a process of accretion, in that links are gradually added and the site goes through a few bursts of new inbound links. What JohnMu seems to be hinting at is that comment spam, the use of meat bots, and off-topic/out-of-area links tend to flag new sites as potential problem sites.
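To make the "accretion vs. burst" idea above concrete, here's a minimal sketch of how a crawler might flag an unnatural link-acquisition spike. This is purely illustrative and hypothetical - the function name, window size, and threshold are invented for the example, not anything Google or any search engine is known to use.

```python
# Hypothetical sketch: natural link growth for a new site is gradual
# accretion, so comparing each day's new inbound links against the
# trailing average can surface suspicious bursts (e.g. a comment-spam
# campaign). Window and spike_factor are made-up illustration values.

def flag_link_bursts(daily_new_links, window=7, spike_factor=5.0):
    """Return indices of days whose new-link count exceeds
    spike_factor times the trailing window average."""
    flagged = []
    for i in range(window, len(daily_new_links)):
        trailing = daily_new_links[i - window:i]
        avg = sum(trailing) / window
        if avg > 0 and daily_new_links[i] > spike_factor * avg:
            flagged.append(i)
    return flagged

# Slow accretion with one suspicious burst on day 10:
history = [1, 2, 1, 0, 2, 1, 3, 2, 1, 2, 40, 3, 2]
print(flag_link_bursts(history))  # [10] - day 10 stands out
```

A real system would of course weigh many more signals (link source quality, topic relevance, anchor text), but the time dimension alone explains why a baseline takes months to establish.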
|What does this say about folks working hard to getting out of Panda recovery recognition? Does this resonate with anyone's experience? |
And in a few months time they'll be saying something with an ever-so slightly twisted slant!
Sites that have been around for 15+ years do not suddenly go bad overnight, which is what has happened in many of the Panduin cases I have seen. Just why is it that Bing can tell quality content and the originals apart, whereas Google cannot? Certainly in my widget sector Bing can.
For everyone involved in forum comment spam, read this and understand.
|That said, if you're engaging in techniques like comment spam, forum profile link-dropping, dropping links in unrelated articles, or just placing it on random websites, then those would be things I'd strongly recommend stopping and cleaning up if you can. |
"or just placing it on random websites" hmm issent that what the web is about that you naturally get links from all kind of sites, I always thought that links from related sites are the links google should look at.
Links seem to have their own Social Network.
Seems like this happens after each "quality" update. Too bad Google suffers from amnesia and can't recall the good sites that ranked well for years with good content BEFORE the update occurred.
From what I am observing post penguin, Google now needs a new update called "know it all" that gets rid of all the new high ranking "answer" type sites. It's ridiculous!
6 months. That is the amount of time needed for a site to truly rank where it should, without lack of age holding it back.
I have found this to be true time and time again. I think google even said something about this as well.
Age is a factor, it 100% is. Adding content during those 6 months is also important in my experience.
The kind of content is important as well. In my experience, visually appealing content (nicely formatted, with images, a nice flow, etc.) earns you many more bonus points than paragraphs of ugly text (even if it's unique).
I have played with all types of different scenarios. From putting up sites with no design, text only, to full blown graphic heavy websites. The nicely laid out ones with graphics placed within content (and even videos) do much better than the sites that have no visual elements.
I've seen the occasional site start ranking much faster than six months. It's usually something that catches media attention for some reason.
So what he is really saying is that the sandbox really does exist?
Good gosh golly.
|but search engines generally need a bit more to be able to confirm that, |
The spokesperson does not speak for "search engines"; he can only accurately speak for Google.
They don't need much time to rate a page anymore, perhaps 2-3 days tops, at least for an established site.
As evidence I can offer affiliate links. I have several pages that do well with a particular affiliate program however that program occasionally lowers its payout for several days in a row. While the payout is low I turn the affiliate links off. When the payout returns I re-activate the links.
Google sends this page traffic when the links are off, a good deal of traffic (it's a great page), but when the links are active the traffic is cut 75% within 1-2 days and doesn't return until 2-3 days after the links are removed.
Couldn't happen without frequent sampling.
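The experiment described above amounts to repeated A/B sampling: log daily traffic alongside whether the links were active, then compare the averages. Here's a hypothetical back-of-the-envelope sketch of that comparison; the function name and traffic numbers are invented for illustration, not the poster's actual data.

```python
# Hypothetical sketch of comparing traffic with affiliate links on vs.
# off, as in the toggling experiment above. Numbers are made up; the
# point is that repeated on/off toggling yields comparable samples.

def mean_traffic_by_state(log):
    """log is a list of (links_active: bool, visits: int) pairs.
    Returns (avg_visits_links_on, avg_visits_links_off)."""
    on = [v for active, v in log if active]
    off = [v for active, v in log if not active]
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return avg(on), avg(off)

log = [(False, 1000), (False, 980), (True, 260), (True, 240),
       (False, 1020), (True, 250), (False, 990)]
on_avg, off_avg = mean_traffic_by_state(log)
print(f"links on: {on_avg:.0f}, links off: {off_avg:.0f}")
```

With enough toggles, a consistent gap like this is hard to explain without the frequent re-evaluation the poster describes, though correlation alone doesn't prove which on-page change Google is reacting to.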
|I've seen the occasional site start ranking much faster than six months. It's usually something that catches media attention for some reason. |
Oh, of course you can rank a site faster than 6 months. Heck, a site can rank in as little as a week. But it will still be held back by its lack of age. Once it starts establishing age, it will rank better and better, as long as its content/links/exposure warrant it.
A site with at least 6 months of age will be much more stable in the rankings, and it's just much easier to rank. A newer site needs much stronger signals to rank quickly.
@Sgt_Kickaxe: are you saying then that linking to this site lowers your page's ranking in the Google SERPs? And when you remove the link the page rises again?
I'm saying that the addition of 7 links to this particular page causes a repeatable disruption in traffic. I can't say for sure it's because of the site I'm linking to, it's a well known and respected site, but I have several months of this data to confirm it's happening.
You can add negative signals into the mix, affiliate links (even unpaid, with nofollow applied) being one of them.
|Sites that have been around for 15+ years do not suddenly go bad overnight which is what has happened for many of the Panduin cases I have seen. Just why is it that Bing can tell the quality and originals whereas Google cannot? Certainly in my widget sector Bing can. |
Seems to me Bing takes a bit longer to index pages than Google does, but once it gets around to them, it has a pretty good, stable idea of where they belong in the rankings. I haven't encountered dramatic shifts in Bing rankings.
I was wondering if Google+ may now be one of the factors in ranking a new site.
I've launched a couple of new sites recently and they managed to rank and get traffic within a little over a month. I did no link building of my own, just announced things via twitter and facebook, others did the linking if any, and the traffic started coming via Google search without too much time delay. Not high volumes of traffic, but enough traffic to see the site was indexed and the proper keywords were ranking for search terms.
What more can you ask for until the next Panda, Penguin or some other B&W Animal Update squishes you like a cockroach?
The current Google seems more committed to not having bad sites rank, even if it is not ranking the best sites. The niches I follow are mostly populated with big sites that have a brief generic page or two on the subject of the search, while sites focused on the search term, sometimes with 100s of pages of content, are deeper in the results.
I guess Google wants to be able to claim their results are improved because there are fewer cases of terrible spammy sites ranking (which I find to be the case). Is this worth losing very good smaller sites and opening the door for negative SEO? So far Google seems to be very happy with the results.
|I've launched a couple of new sites recently and they managed to rank and get traffic within a little over a month. |
I have done the same with similar results. I am actually finding it a bit easier with new sites and my time spent on new sites has yielded much better results than my time spent on trying to figure out how to get older sites to return to previous levels.
I think this thread kind of reiterates the historic 'sandbox effect', which used to be a very common question and/or answer here on WebmasterWorld. I believe there still is a sandbox effect, refreshed less often than Panda, so I'd bet many new sites think they're affected by Panda when really they've been hit by both a premature Panda and some kind of sandbox, combined with the honeymoon phase.
|I guess Google wants to be able to claim their results are improved because there are fewer cases of terrible spammy sites ranking |
I agree and disagree. I find that their results seem spammier now than they used to because many low-quality sites (albeit with really good content spinners along with bought links) are ranking well now. The most obvious spam sites are now gone. Though they seem to come and go.
OT: On that note: For a company that is apparently so obsessed with social networking (successful services based on real-time info), they sure do take their dear old time updating and engaging themselves.
|I believe there still is a sandbox effect |
Oh, how I hate that term. A true sandbox is a predefined time of limited functionality. Never existed and doesn't exist. But I guess given the history of the topic, the confusion is going to stay.
|But clearly he is saying that having a great web site is one thing. Google still needs to verify other factors, off page factors, to validate the site is truly fantastic to others. |
.... like the brand / trusted link profile
|Singhal: Don't do it man. |
Everyone says I need more links. How do links improve the quality of the site? I don't want to play this game and I don't want to do this.
Cutts: What matters is the bottom line. Links are a part of search - they represent online reputation.
So building and investing in a fantastic website after Panda probably isn't going to cut it without a solid, trusted link profile improvement - and a link profile that doesn't engage in over-optimization, per Penguin. Many folks clawing their way back from Panda may have overlooked this. The holistic approach, and the possible contradiction in understanding, was the intended context of my opening thread - sorry if it appeared cryptic.
Is it me, or is this the final reveal of what the infamous Florida update was about, with the alleged "sandbox"?
Prior to that, sites could rank immediately at #1 for any term, whenever the Google Dance started.
I also believe that some industries (keyword sectors) may have a longer duration until a new site will see rankings.
Oh, and Tedster - I promise you there was a sandbox effect. I have seen tactics that used to work to get around it.
Have you ever seen a site rank #1 for 'Buy Viagra' in days? But using the same tactics with a new domain - never ranks.
Certainly there was a delayed ranking effect - but "sandbox" was and is a terrible, misleading label for it. For one thing, it implied a set time, and that was not the case. It made lots of webmasters very confused about what Google was actually doing.
Pretty much like today, unfortunately. When lots of people say the same thing over and over, then webmasters assume that it's true. Happens even if the idea is a distortion. And then webmasters waste their energies, get frustrated, etc, etc.
Oh right, yes. But that is what Google wants - everyone who is trying to game them, confused.
Whereas the sites that are the real deal just get on with it, making their sites better and better for their users - which naturally builds more links, the right links.
Well, many people have been using strategies to game Google and didn't even realize that's what they were doing. They learned their "SEO" third and fourth hand and never understood how far out-of-date they were. It's all hit the fan now.
I remember one time in the very early days of Google, before they introduced host crowding, when a site I built (it was pretty "fantastic" for those days, I guess) took all ten positions for my target query term. I was embarrassed - I hated it, and that was the first time I ever "de-optimized" a site. I'm not interested in grabbing whatever I can get - and I never was.
I think this comment from John Mueller, about needing time and other signals to assess a site, is worth sitting with. I think anyone who does that should see their antiquated approach to SEO undermined, and in a big way.
|I guess Google wants to be able to claim their results are improved because there are fewer cases of terrible spammy sites ranking(which I find to be the case). Is this worth losing very good smaller sites and opening the door for negative SEO? So far Google seems to be very happy with the results. |
This may look great to a Google engineer, but to someone trying to find information it's a waste of time when the first sites in the listings have the same plain vanilla superficial data. It's interesting to note, though, that the sites which are ranking are all well-established ones, even though the pages in question are fairly new. This shows that Google places more weight on how much it trusts a site than on how valuable or unique the data is.
Trying to build a better website with better content than the competition is therefore of secondary importance. Establishing a brand and trying to rank for everything even remotely connected to it is a more successful strategy at the moment.
Should Google concentrate more on the quality and trust of a page, rather than a website? Perhaps it will come, their machine learning is still at a very early stage.
[edited by: superclown2 at 7:34 am (utc) on Jun 22, 2012]