Hello,
I'm new here and mostly came seeking help. We have a site that we have gone to great effort to make as SEO-friendly as we can.
<removed URL>
Nobody in my company is actually an SEO expert, so it's mostly a matter of documenting and trying things. We were getting pretty decent results, with gradual growth, until around the 5th of April, when we experienced a sudden drop. We scrambled to work out what might have caused it, wondering whether our <widget> finder might have been considered "spamming".
<removed URL>
But that finder had been working without any issues for months before the penalty, and other websites with similar content didn't seem to have any problems appearing in search engines, so we just weren't sure. We did find that our most frequent keywords were a mess: Googlebot was reading our embedded Google Maps incorrectly, which made "(Not found)" the most prominent word on our site. We have since corrected that, and our most used sitewide words now make sense. Anyway, the drop persisted until around the 20th of May, when we started recovering our old volume. This seemed to coincide with the rollout of Panda 4.0, so we assumed we had caught a break: either our content was considered appropriate for the new Panda iteration, or Google had detected that we corrected the frequent-words problem.
Things had been growing steadily; we added a new section dedicated to <sub-niche> and noticed increasing traffic. Nothing amazing, but nice numbers, with people accessing the new content.
<removed URL>
That was going well until the 6th of June, when we again hit rock bottom on clicks and are barely appearing in Google.
We really have no idea what we are doing wrong right now; we seem to be constantly getting mixed messages. The only thing that is clearly not up to snuff, and that didn't appear before, is in Webmaster Tools: the Fetch as Google tool has started showing pages as partially completed, with the following URLs:
<Mod note: Obscured URLs below and translated WMT from Spanish>

URL: http://fonts.googleapis.com/css?family=Open+Sans:400,400italic,700,700italic
Type: Stylesheet
Reason: Denied by robots.txt

URL: https://www.google.com/uds/?file=feeds&v=1
Type: Script
Reason: Denied by robots.txt

URL: http://maps.googleapis.com/maps/api/staticmap?center=-1,-1&zoom=18&size=485x480&maptype=roadmap&markers=icon:etc...
Type: Image
Reason: Denied by robots.txt

URL: http://googleads.g.doubleclick.net/pagead/viewthroughconversion/etc...
Type: Resource
Reason: Denied by robots.txt
The thing is, these are all external Google libraries we use for font styling, for fetching and formatting RSS feeds, for getting a static Google Maps image, and for ad purposes. They are certainly not blocked in our own robots.txt, so I don't understand why they have become a problem now, when many of these references have been there for half a year. Only 6 pages have popped up so far, but I expect the whole site could potentially show these "partially completed" warnings, since, for example, the font is used in the header of the site. We also had some references to webservices of our own, but I altered our project structure a bit so we now have "public webservices" and "private webservices" to avoid this warning.
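For what it's worth, since the "Denied by robots.txt" reason comes from the robots.txt files served by those external hosts (fonts.googleapis.com, maps.googleapis.com, etc.), not from ours, one can sanity-check which rules block which resources with Python's standard urllib.robotparser. This is just an illustrative sketch; the rules below are made up for the example, not the live robots.txt of any Google host:

```python
from urllib.robotparser import RobotFileParser

def blocked_for(agent, url, robots_txt_lines):
    """Return True if the given robots.txt rules disallow url for agent."""
    parser = RobotFileParser()
    parser.parse(robots_txt_lines)  # parse rules supplied as lines of text
    return not parser.can_fetch(agent, url)

# Illustrative rules only -- fetch the host's live robots.txt to be sure.
rules = [
    "User-agent: *",
    "Disallow: /css",
]

print(blocked_for("Googlebot",
                  "http://fonts.googleapis.com/css?family=Open+Sans",
                  rules))  # -> True: the /css path is disallowed for all agents
print(blocked_for("Googlebot",
                  "http://fonts.googleapis.com/icon",
                  rules))  # -> False: no rule matches this path
```

To check a live host, the same parser can load the real file with RobotFileParser(url) plus read() instead of parse().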
Anyway, getting back on track, this is really the only obvious thing we are seeing in Webmaster Tools that might justify a Google penalty. Outside of that we are walking blind, and I would greatly appreciate it if anybody could shed some light on why this might be happening to our site.
Thank you all for your help, and best regards.
Oscar González Terrazo.
[edited by: aakk9999 at 3:01 pm (utc) on Jun 9, 2014]
[edit reason] Exemplified. Please no site URLs nor identifying details. [/edit]