Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Having SEO problems. Lost on how to act


OscarTerrazo

2:22 pm on Jun 9, 2014 (gmt 0)

10+ Year Member



Hello,

I'm new here and mostly came seeking help. We have a site that we have made great efforts to make as SEO-friendly as we can.

<removed URL>

Nobody in my company is actually an SEO expert, so it's mostly a matter of documenting and trying things. We were getting some pretty decent results with gradual growth until around the 5th of April, when we experienced a sudden drop. We scrambled to work out the cause, thinking that maybe our <widget> finder had been considered "spamming".

<removed URL>

But this had been working without any issues for months before the penalty, and other websites with similar content didn't seem to have any problems appearing in search engines, so we just weren't really sure. We did detect that our most frequent words were a mess because Googlebot was reading Google Maps incorrectly, making "(Not found)" the most prominent phrase on our site, but we have already corrected that and our most used sitewide words now make sense. Anyway, the drop lasted until around the 20th of May, when we started recovering our old volume. This seemed to coincide with the deployment of Panda 4.0, so we assumed we had caught a break and our current content was considered appropriate for the new Panda iteration, or maybe that Google had detected we corrected our most-frequent-words problem.

Things have been steadily growing: we have added a new section dedicated to <sub-niche> and have noticed increasing traffic. Nothing amazing, but nice numbers, and people are accessing the new content.

<removed URL>

This was going well until the 6th of June, when we again hit rock bottom on clicks and are barely appearing in Google.

We really don't have any idea what we are doing wrong right now; we seem to be getting mixed messages constantly. The only thing that is clearly not up to snuff, and that didn't appear before, is in Webmaster Tools, where the Google explorer has started showing pages as being partially completed, with the following URLs:

<Mod note: Obscured URLs below and translated WMT from Spanish>

URL: http://fonts.googleapis.com/css?family=Open+Sans:400,400italic,700,700italic
Type: Stylesheet
Reason: Denied by robots.txt

URL: https://www.google.com/uds/?file=feeds&v=1
Type: Script
Reason: Denied by robots.txt

URL: http://maps.googleapis.com/maps/api/staticmap?center=-1,-1&zoom=18&size=485x480&maptype=roadmap&markers=icon:etc...
Type: Image
Reason: Denied by robots.txt

URL: http://googleads.g.doubleclick.net/pagead/viewthroughconversion/etc...
Type: Resource
Reason: Denied by robots.txt

Thing is, they are all external Google libraries we use for font styles, for fetching and formatting RSS feeds, for getting a static Google Maps image, and for ad purposes. They are certainly not included in our robots.txt file, but I don't understand why they have become a problem now, when many of these references have been there for half a year. Only 6 pages have popped up, but I would expect the whole site to potentially have these "partially completed" warnings, since, for example, the font is used in the header of the site. We also had some references to some web services of our own, but I altered our project structure a bit so we have "public web services" and "private web services" to avoid this warning.
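For reference, the "Denied by robots.txt" reason in those reports refers to the robots.txt of the host serving the resource (here, Google's own), not the site's file. A minimal standard-library sketch of how to check a host's rules against a resource URL; the rules below are hypothetical placeholders, not Google's actual robots.txt:

```python
# Minimal sketch: check whether a host's robots.txt blocks a given resource.
# The rules below are hypothetical placeholders, not Google's real file;
# fetch e.g. https://www.google.com/robots.txt to test the actual rules.
from urllib.robotparser import RobotFileParser

sample_rules = """\
User-agent: *
Disallow: /uds/
Allow: /maps/api/
"""

parser = RobotFileParser()
parser.parse(sample_rules.splitlines())

# Only the URL path matters to the parser, so full resource URLs can be tested.
blocked = parser.can_fetch("Googlebot", "https://www.google.com/uds/?file=feeds&v=1")
allowed = parser.can_fetch("Googlebot", "https://www.google.com/maps/api/staticmap")

# blocked is False (the /uds/ script is denied); allowed is True
```

If it is the third party's own robots.txt denying the fetch, nothing in the site's own robots.txt or sitemap can change that.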

Anyway, getting back on track: this is really the only obvious thing we are seeing in Webmaster Tools that might justify a Google penalty. Outside of that we are walking blind, and I would greatly appreciate it if anybody could give us some insight into why this might be happening to our site.

Thank you all for your help and best regards.

Oscar González Terrazo.

[edited by: aakk9999 at 3:01 pm (utc) on Jun 9, 2014]
[edit reason] Exemplified. Please no site URLs nor identifying details. [/edit]

Clay_More

9:43 pm on Jun 9, 2014 (gmt 0)

10+ Year Member



If you search Google using the site: operator, what results do you receive? Examples, not specifics.

Do you have a sitemap pointing to pages disallowed in robots.txt?
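(A rough way to answer that question yourself is to compare the sitemap's URLs against the robots.txt rules; the sitemap and rules below are made-up placeholders:)

```python
# Rough sketch: flag sitemap URLs that the site's own robots.txt disallows.
# Both documents below are made-up placeholders; load the real files in practice.
import xml.etree.ElementTree as ET
from urllib.robotparser import RobotFileParser

sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/widgets/</loc></url>
  <url><loc>https://www.example.com/private/account</loc></url>
</urlset>"""

robots_rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_rules.splitlines())

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
locs = [loc.text for loc in ET.fromstring(sitemap_xml).findall("sm:url/sm:loc", ns)]

# Any URL listed in the sitemap but disallowed by robots.txt is a conflict.
conflicts = [u for u in locs if not parser.can_fetch("Googlebot", u)]
# conflicts == ["https://www.example.com/private/account"]
```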

lucy24

12:56 am on Jun 10, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Do you have a sitemap pointing to pages disallowed in robots.txt?

I thought he was saying that everything listed as "Disallowed in robots.txt" is in fact a google page meaning that neither his own sitemap nor his own robots.txt can possibly have any effect. It would be pretty nervy* of g### to robot-out its own files, and then cite this as an excuse for inability to fully render the page.


* Either nervy or hilariously incompetent-- and I don't think the latter is an accurate description of today's Google.

Clay_More

4:37 am on Jun 10, 2014 (gmt 0)

10+ Year Member



Those do appear to be Google pages, but Webmaster Tools is suggesting they belong to the OP's site.

I'm not sure what happens when a page is disallowed in robots.txt but there are on-site links pointing to the same page. That's why I asked.
While Google isn't hilariously incompetent, Webmaster Tools has its own bugs. Thus, I asked about the site: operator to get an idea of how Google has the site indexed and whether there are any glaring issues.

Did you have a suggestion related to the O.P.'s issue?

OscarTerrazo

8:34 am on Jun 10, 2014 (gmt 0)

10+ Year Member



Hello, thank you both for taking interest.

If I use site: (MyWebsite) I get my indexed pages, and at a quick glance nothing looks strange. The sitemap doesn't contain any of the pages in robots.txt, since they are either web services or pages dedicated to logged-in user content. I still had to remove some web methods restricted in robots.txt, since aside from the URLs I posted I had a couple of my own, but those were my mistake (though I still don't understand why they were never a problem over the last year).

The Google URLs I listed are not referenced in my robots.txt, nor in the sitemap, since they really aren't part of my site and I shouldn't have any influence over them. It baffles me that this now seems to be an issue when it never was before, but now they are popping up.

I just don't know how to approach it other than removing all my external Google references. That isn't a problem for the font, but it is for the remarketing aspect of AdWords, for the map images, and for getting feeds.

This is also supposing these "partially completed" warnings in Webmaster Tools are even relevant, since I'm still not sure whether our Google drop has anything to do with this or if it's a content thing...

lucy24

5:09 pm on Jun 10, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



starting to show pages as being partially completed

Isn't "partial" the word they use when talking about rendering a page? There's a very recent thread about this added feature, with particular reference to javascript. Does the message now show up somewhere else-- that is, somewhere other than in response to a fetch-and-render request?

not2easy

6:41 pm on Jun 10, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Lacking more info, I can only suggest checking whether all the <script> tags are in the head of the pages, and thinking about moving them to just before the closing </body> tag.

Also, scripts used across the whole site should live in an external .js file that gets cached, so it doesn't take a new call to load them fresh on every page.
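As a sketch of that layout (the file names are hypothetical):

```html
<!-- Sketch only; file names are hypothetical. -->
<html>
<head>
  <title>Example page</title>
  <!-- No blocking <script> tags in the head -->
</head>
<body>
  <p>Page content renders first.</p>

  <!-- Site-wide code in one external, cacheable .js file,
       loaded just before the closing body tag -->
  <script src="/js/site.js"></script>
</body>
</html>
```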

Clay_More

9:04 pm on Jun 11, 2014 (gmt 0)

10+ Year Member



With the limited information presented, I think I'd ignore the robots.txt issue for now. You say it's only 6 pages, and if that is a fairly small percentage of your pages, I'd watch it, but not obsess over it for now.

Your dates of change don't correspond with known algorithm changes, so the highest probability is that on-site changes have created problems. If the maps and keyword issue was discovered after April 5th, subsequently fixed, and traffic then started improving, that would make sense. You wouldn't be likely to see improvement from Panda 4.0 if an earlier iteration hadn't "hurt" you.

The second, current drop could well be related to the addition of the section about the sub-niche. There used to be some discussion about problems when a site was "fiddled with" too much. Usually small but frequent changes.

Keeping in mind this is mostly guesswork based on limited information, I'd slow down the "trying things" and let Google absorb the changes to date. Sometimes the best thing to do is just let it rest for a while.

OscarTerrazo

8:24 am on Jun 12, 2014 (gmt 0)

10+ Year Member



Well, for some arbitrary reason, yesterday our site recovered its traffic and is showing up high in searches again. I'm really confused by what happened, since the only thing I did was update the sitemap referenced in robots.txt, and that's pretty much it.

Sigh, it's so frustrating how blind I am to what is actually happening underneath... And I'm still wary of another unexpected dip.

I've checked the robots.txt issue again and only those 6 pages are still showing, outside of some testing we did here, so it hasn't increased.

Anyway, thank you all for your help. I'll come back and add some more information if I find out what the whole partial thing was about.

netmeg

12:08 pm on Jun 12, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Sigh, it's so frustrating how blind I am to what is actually happening underneath... And I'm still wary of another unexpected dip.


Well don't disappear now that it's fixed; stick around and read the forum, get less blind, and you might get some insights.