Forum Moderators: Robert Charlton & goodroi
I run a large, well-established website that has been around for years, generating millions of visitors per month across millions of pages of documents. We have always been diligent about tracking all of our website metrics so that we understand user behavior, where our audience is coming from, and can use the data in order to improve user experience.
Recently we have been experiencing *very* erratic Google organic traffic, which jumps up and down by 30%-80%. The cycle has now repeated itself 6 times over about 6 weeks. While there have been times in the past when our Google organic traffic has increased and decreased, it has always done so in a measured manner; we have never before seen erratic behavior from Google.
Here are the traffic specifics:
• June 3, Google organic drops by 30% vs. normal
• June 4, Google organic traffic returns to normal
• June 9, Google organic again drops by 30% vs. normal
• June 17, Google organic returns to normal
• June 19, Google organic again drops by 30% vs. normal
• June 27, Google organic returns to normal
• July 9, Google organic again drops by 30% vs. normal
• July 11, Google organic returns to normal
• July 12, Google organic again drops, but this time by 80% vs. normal
• July 13, Google organic returns to normal
While we are constantly in the process of refining our site, the only major change over the last couple months has been to our “related articles” component which does what it sounds like: if you are looking at article A, here are a handful of other articles that are highly relevant to the one you are viewing. Over time, we have been tuning the algorithm that generates these links so as to improve relevancy.
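A component like this is commonly built on TF-IDF cosine similarity between article texts. A minimal, illustrative sketch of that general technique (the function names here are made up for this post, not the poster's actual component):

```python
# Hypothetical "related articles" selector using TF-IDF cosine similarity.
# Assumes each article is already tokenized into a list of lowercase words.
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build simple TF-IDF vectors (term -> weight) for tokenized documents."""
    n = len(docs)
    df = Counter()                      # document frequency per term
    for doc in docs:
        df.update(set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vec = {t: (tf[t] / len(doc)) * math.log(n / df[t]) for t in tf}
        vectors.append(vec)
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def related_articles(docs, index, k=3):
    """Return indices of the k articles most similar to docs[index]."""
    vecs = tfidf_vectors(docs)
    scores = [(cosine(vecs[index], v), i)
              for i, v in enumerate(vecs) if i != index]
    return [i for _, i in sorted(scores, reverse=True)[:k]]
```

Tuning such a component (adjusting weights, stop words, or the candidate pool) changes which internal links appear on every article page at once, which is why a change here can have site-wide crawling effects.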
I have also noticed some other artifacts:
• Googlebot spidering activity has increased, reaching a plateau of about 140% of pre-link-change levels; on some days approaching 1 million pages/day.
• The number of pages indexed in our Google Webmaster Tools sitemap reports jumped by 12%.
Any ideas about what might be going on here?
Thanks!
Greg
[edited by: tedster at 8:03 am (utc) on July 15, 2008]
We are not doing any purposeful keyword stuffing of any sort, however we have over 5 million articles on our site and it is certainly possible that when we are pulling over "related articles" that the titles are very similar.
Is it normal to see organics go up and down repeatedly prior to a penalty?
Thanks
Such fluctuations can also be considered normal, as Google keeps adapting new things in its algorithm.
I had a somewhat similar pattern to yours and then finally got smashed on June 3rd, and since then I have figured out so many stupid things I never bothered about before, e.g. duplicate meta titles.
The weird thing is that after the traffic dropped off, I changed my homepage title and description, which were picked up and shown in my 150+ rankings right away. Now that my page-1 rankings are back, the old meta title and description are too.
Definitely looks like a roll back at least on the terms I'm following.
Martin Ice Web - only a handful of the sites I own or manage are in WMT. A couple of our high-profile non-WMT sites are approximately where they used to be, around 1-3 for certain key phrases.
Another site, in WMT and with a sitemap, is about 10-20 down on where I would expect, and its sister site (not in WMT) has taken over some of its slots despite being less relevant, with the main site nowhere in the top 50. The site title belongs to the sister site; the description is taken from text on the primary site (technically feasible in this case, but nowhere linked as such). If Google is supposed to be using the site title as a major keyword, it's screwed up big time (again).
(These are UK sites hosted in UK, all on same server; searched using google.co.uk using both Web and UK searches.)
Our own primary company site has been hovering around 14-16 for the past few days (at least) for one key phrase. It used to be around 2-5 but I can't say when it began dropping. But one thing about the listed result: it's got the wrong description! Entirely. The text that google claims is our description has not been used on the site for several years and no longer exists online, although we know it's still archived in dmoz at #1 for the old site's keywords.
The incorrect description could account for our lower position if google is parsing that actual (old) page, since it never had much in the way of relevant keywords (change of use).
Last night I added the site to WMT with a sitemap. We'll have to see if that makes any difference. Probably lose it altogether! :(
A thought about dmoz: I don't suppose google are trying to relate new site content to dmoz? If so then it's seriously out of line - that dinosaur has been dead for years!
WMT pages may be hit harder than other pages
I think that's extremely unlikely. Google wants webmasters to use their service - and all it does is give you reports and a way to communicate as the authenticated responsible person. In other words, it's a report only, and not a cause of anything.
Depending on how you generate a sitemap, you may introduce spidering problems, expose duplicate URLs, and so on; that I can see possibly causing some trouble. But not just a WMT account on its own. And I've got scores of them as background experience, too.
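To illustrate the duplicate-URL point: a sitemap generator that doesn't normalize URLs can list the same page several times under slightly different addresses. A sketch of a normalization-and-dedupe pass (the helper names are hypothetical, not from any particular tool):

```python
# Illustrative only: collapse trivial URL variants (case in host, trailing
# slash, fragments, query strings) before writing sitemap entries, so the
# same page is not exposed to crawlers under multiple URLs.
from urllib.parse import urlsplit, urlunsplit

def normalize_url(url):
    """Lowercase scheme/host, drop fragment and query, strip trailing slash."""
    parts = urlsplit(url)
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       path, "", ""))

def write_sitemap(urls):
    """Return sitemap XML containing each normalized URL exactly once."""
    seen, entries = set(), []
    for url in urls:
        norm = normalize_url(url)
        if norm not in seen:
            seen.add(norm)
            entries.append("  <url><loc>%s</loc></url>" % norm)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + "\n".join(entries) + "\n</urlset>")
```

Note that blindly dropping query strings, as this sketch does, is only safe when the query string doesn't select distinct content; a real generator would need a site-specific rule for that.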
But you have to admit that this would be an easy way for Google to filter SEO'd sites from "normal" sites? Google has been pushing WMT extremely hard for the last few months. I think it's naive to believe that Google does not use the data collected through WMT (not only for crawling) in its rankings, when Google intends to collect all available data on each site. You get something from Google, but you also do a complete sellout. Maybe it's only a conspiracy theory, but it's worth a thought.
[edited by: Dave_Hybrid at 10:20 am (utc) on July 15, 2008]
To put it to the test: has anyone suffered whose site is not registered in WMT?
[edited by: Martin_Ice_Web at 12:46 pm (utc) on July 15, 2008]
I have two similar <snip> sites located on the same server. Over the last week or so, both sites have been cycling between their original positions and the #3 position for the most competitive terms, each one pushing the other back to its old position. Very strange!
Any ideas on this? I wish it would settle down and give me a true ranking! Either way, I'm pleased that I'm hitting the top spots - I just want to know which client I can tell the good news to!
Thanks.
[edited by: Receptional_Andy at 1:00 pm (utc) on July 15, 2008]
[edit reason] moved from another location + removed specific industry [/edit]
but you have to admit that this would be an easy way for Google to filter SEO'd sites from "normal" sites?
What site isn't SEO'd these days? Google cannot have a problem with SEO'd sites in general, just bad SEOs.
I think it's naive to think that Google does not use the collected data from WMT (not only for crawling) for their rankings,
I cannot think of a reason they would. Webmaster Tools shows US what Google already has on our site; what extra information do you give them that would affect rankings? Sure, they could see which other sites you own for the purposes of cross-linking, but surely they already have filters for that, such as whois data? (which would be more reliable, too ...)
Maybe its only a conspiracy theory but its worth a thought.
No, I actually feel that my time has been wasted but I didn't want people to start pulling their info off of GWT because for some that is the only site error knowledge tool they are using.
This has been going on for quite a while now.
They don't deny it. My opinion is they will probably punish your sites if you use GWT. Just imagine a manual review of your site: the reviewer doesn't like your site, so he tanks all your sites at once using the GWT data because he is feeling lazy at the time.
Everything and anything is possible.
SEOPTI - because Google doesn't confirm something does not mean that they are or are not using it. Basing SEO and marketing decisions on a 'guess' - that they are using GWT for penalties or ranking - needs to be substantiated with more solid evidence before I ever use that theory in diagnosis. The fact is tons of sites have tanked that do not use GWT or GA...
Anything and everything is possible
Yes, but when I am troubleshooting and diagnosing rankings, I stick first with what is "probable". You wouldn't replace all the wiring in your house the minute the power goes out, would you ? :)
but u have to admit that this would be a easy way for google to filter SEO site from "normal" sites?
How? Just because they have a record of your xml sitemap and some information? They are more likely to get the information they need by measuring all of the factors they currently measure in order to assess your website - link build rate, number of links, threshold, age, topic strength, LSI. The information is not difficult for Google to get - it gets the information for billions of pages daily.
So why is there no Google rep who will say, "No, we don't use GWT for penalties"?
First of all, Google reps don't hang out here as often as they did a few years ago.
Second, do you really expect Google to respond to every conspiracy theory that gets trotted out in a Web forum?
They don't deny it, my opinion is they will probably punish your sites if you use GWT.
If they were going to be that illogical and arbitrary, it would make more sense for them to punish sites that don't use the Webmaster tools, on the theory that such sites have something to hide.
Establishing cause and effect is not as easy as saying "this happened after that." It's called the post hoc ergo propter hoc [skepdic.com] logical fallacy. That fallacy creates magical thinking and mythology - including SEO mythology.
I say we should look at our ranking challenges with fresh eyes, without all the assumptions that we think are true. Google has changed something in the ranking criteria, and we're not sure what that change is. A drop in ranking may not be "a penalty", but rather just a failure to do well with regard to some new measure or other. In other words, the url just doesn't rank as well as it did, and there may not be a penalty involved at all.
A drop in ranking may not be "a penalty", but rather just a failure to do well with regard to some new measure or other. In other words, the url just doesn't rank as well as it did, and there may not be a penalty involved at all.
Many of us are facing more competition than we did in the past, too--from more sites, and from more pages. If those pages are better than ours, have more quality inbound links than ours do, or have some other advantage (including whatever changes Google might be making in its algorithm or filters), then we can't expect our pages to rank as well as they did previously.
Or, to use an analogy: If 1,000 people are running in the marathon, it's probably going to be tougher to win than it was when 100 people were running.
I have now changed the anchor text on those 2 bought links, and hopefully I will come back; otherwise I will probably remove those 2 paid links completely. If Google doesn't penalize sites for buying links, only for selling them, then this would be an example of a site that has been penalized for gaining too many links too quickly with the same anchor text.