|Drop to 30th page of Google after Joomla upgrade|
| 7:23 am on Jul 26, 2013 (gmt 0)|
I was ranked on the first page of Google for a highly searched keyword, and suddenly I dropped to the 30th page. It has been over a month and I haven't moved.
I have changed nothing on my website and don't understand why this is happening (the only thing I did was upgrade Joomla).
I use Joomla, and I had version 1.5 when I was ranked on the first page. I decided to upgrade to version 2.5, and something went wrong during the upgrade, because Google started indexing hundreds of pages when my website only has about 50.
In Webmaster Tools I now have about 800 pages instead of 50. I managed to block Google from indexing more by putting up a robots.txt and reinstalling Joomla 1.5, and the number of indexed pages is VERY slowly decreasing (after about a month and a half Google has deindexed about 100 pages, but there are still about 650 to go).
My questions:
Do you think Google moved me to page 30 because of the Joomla problem I had, and how long do you think it will take to get back to page 1?
Do you think I will need to wait until Google deindexes all the unneeded pages (the 650), or will it be a faster process?
Has anyone on this forum experienced this issue?
| 9:08 am on Jul 26, 2013 (gmt 0)|
It would make no sense for Google to lower your rankings based on the version of Joomla. I suspect, however, that the problems you encountered during the upgrade may very well have caused this.
|I managed to block google from indexing more by putting a robots.txt |
A robots.txt file entry will not stop Google from indexing URLs; it will only stop Google from crawling the pages. The result is that Google may still index the URLs, with a note saying the pages could not be crawled.
If Google is indexing pages you do not have, then you need to make sure that those URLs do not exist on your site and that they return a 404 status code when visited.
It may take months, or even years, for Google to stop crawling those URLs and reporting them in GWT. I occasionally see URLs reported as 404 errors in GWT that have not existed since 2007.
| 9:41 am on Jul 26, 2013 (gmt 0)|
|A robots.txt file entry will not stop Google from indexing urls, it will only stop Google from crawling the pages. |
Whilst this statement is true, a robots.txt block may stop Google from discovering other URLs that are generated from the blocked page. It often happens that from one leaked URL Google discovers hundreds or thousands of other URLs.
A good example would be a page that offers the visitor a URL with a date as a parameter. If you stop Google from visiting the initial page via robots.txt, then Google may never "discover" the multiple URLs for the same content that differ only in the date parameter, such as a "next day" link that changes daily.
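As a sketch of such a block (the path and parameter name here are hypothetical, not from the OP's site), and noting that Googlebot supports the `*` wildcard even though it is not part of the original robots.txt standard:

```
User-agent: Googlebot
# Block the calendar-style page that generates date-parameter links (hypothetical path)
Disallow: /calendar
# Block any URL carrying a date parameter (hypothetical parameter name)
Disallow: /*?date=
```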
Back to the OP's question: it is entirely likely that a technical error in the Joomla upgrade caused the ranking drop.
- Have your URLs remained the same after the Joomla upgrade?
- Has your site structure (navigation, and navigation type) remained the same, or has it changed?
- I presume these additional indexed URLs point to existing pages, but with some extra parameters (or a changed parameter order)?
As JS_Harris said above, it would be better if these pages served a 404, or better yet a 410 Gone (Google will drop them faster). So, is it possible to change your .htaccess to serve a 410 based on some unique string in the URL for these pages? Of course, you would then need to remove the robots.txt block so that Google can request these URLs and see the 410 Gone.
With regard to how long it will take Google to sort out your site: it depends. In general, I have noticed that while the drop is usually an "off the cliff" type of chart, the climb back is a "mild upward slope". Because these "new URLs" that Google has discovered are low value, Google will not crawl them often, so it will take some time to re-process them. It also depends on your crawl budget, i.e. how often Google crawls your site and the average number of URLs Google usually requests.
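A minimal sketch of such an .htaccess rule, assuming Apache with mod_rewrite enabled and assuming (purely for illustration) that the unwanted duplicates all carry a hypothetical "dupe=" query parameter; the pattern must be adjusted to whatever string is actually unique to those URLs:

```apache
RewriteEngine On
# If the query string contains the (hypothetical) duplicate marker...
RewriteCond %{QUERY_STRING} (^|&)dupe=
# ...answer 410 Gone (the [G] flag also implies [L])
RewriteRule ^ - [G]
```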
| 10:49 am on Jul 26, 2013 (gmt 0)|
In your messages you mention that robots.txt will not stop Google from indexing, just from crawling, but in my robots.txt I have "Dissalow : /" and "Noindex : /". Don't those two commands stop Google from indexing and crawling?
Concerning the 404 error, I don't think it is possible to create a 404 error in Joomla, is it? Joomla works in such a way that with an article ID you can create thousands of addresses; see this example, where I created my own address within the Joomla website (I added my own text).
Concerning my website :
The URLs remained the same
The site structure remained the same
The additional URLs do point to existing pages, that is true, but with the wrong parameters (for example, I have an article that appears under the wrong section of my website, and that created a new URL).
Do you think I can make the process faster by doing something in addition to what I have already done (maybe creating 404 errors, if that is possible, for the pages I want deindexed), or should I just wait?
[edited by: phranque at 2:03 pm (utc) on Jul 26, 2013]
[edit reason] fixed url display [/edit]
| 11:50 pm on Jul 28, 2013 (gmt 0)|
You have a few things wrong with your robots.txt:
a) "Dissalow" is misspelled, it should be "Disallow"
b) With "Disallow: /" you are blocking the whole site from crawling. In that case Google will drop you in a matter of days. You need to disallow a specific URL pattern so that you stop Google crawling only your duplicate pages
c) "Noindex: /" is not a valid robots.txt directive. To noindex a page, you must use the robots meta tag on the page itself, <meta name="robots" content="noindex">, but then you need to let Google crawl the page in order to see this meta tag
As far as I know, Joomla is commonly hosted on an Apache server, and Apache uses .htaccess. You can edit .htaccess to set up 404 or 410 response codes, or to set up a 301 redirect to the correct URL. I also seem to remember that there are Joomla plugins for SEO-friendly URLs; set up correctly, they will not expose Joomla's internal dynamic URLs, so perhaps you should investigate whether you could use one.
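For instance, such a 301 in .htaccess might be sketched like this (mod_rewrite assumed; both paths are made-up examples, not the OP's actual URLs):

```apache
RewriteEngine On
# Send the duplicate URL to the correct one with a permanent redirect
RewriteRule ^wrong-section/my-article$ /correct-section/my-article [R=301,L]
```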
|Concerning the 404 error I don't think it is possible create a 404 error in joomla , is it ? because joomla works in such a way that with an article ID you can create 1000 of address |
Has the article appeared in the wrong section of your website just because you upgraded Joomla, or because of some other change?
|The additional URLs point to existing page that is true but with the wrong parameters ( For example I have an article that appears under the wrong section of my website and it created and new URL ) |
Have you looked into using the canonical link element for your articles, where you define which should be the canonical (main) URL for each article? You may want to search for a Joomla canonical plugin and investigate whether using it would solve your multiple-URL problem.
I would, however, suggest that you make these changes on a test site (blocked from Google via robots.txt or password access) and port them to your live site only when you are happy with the results; otherwise you may do further damage to your site.
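For reference, the canonical link element is a single line in the page's <head>; the URL below is only an example:

```html
<!-- Tells Google which URL is the preferred one for this content -->
<link rel="canonical" href="http://www.example.com/section/my-article">
```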
| 12:26 am on Jul 29, 2013 (gmt 0)|
|As far as I know, Joomla is commonly hosted on Apache server and Apache uses .htaccess You can edit .htaccess to set up 404 or 410 response codes or to set up a 301 redirect to a correct URL. |
It shouldn't be necessary to use .htaccess (or, for that matter, the IIS equivalent) to issue a 404. A 410, yes, because that's an intentional statement from the site. But a 404 is the default response when the server can't find a file.
One thing you can and should do is look at the "URL parameters" area of GWT. There should already be a list of all the parameters Google has found. Each one should be classified first as "affects page content" or "doesn't affect page content", and among those that do affect page content there's an array of "ignore" flags. If you find any listed parameter that you don't use at all, you may need to flag it manually as "doesn't affect page content".
You should also make sure that if you yourself request a garbage URL (either a nonexistent path or something with bogus parameters) you get a 404 "page not found" response. If you don't, you will need to consult Joomla experts, because this is a CMS issue.
| 7:28 am on Jul 29, 2013 (gmt 0)|
If you upgraded from Joomla 1.5 to 2.5 I suspect that the URLs for your existing content changed.
Make sure you also upgrade your .htaccess file to the latest version. It is delivered as "htaccess.txt" or some such name and must be renamed in order for it to work.
| 8:01 am on Jul 29, 2013 (gmt 0)|
Does anyone know how to create a 404 error in Joomla for a range of article IDs (e.g. article IDs containing the numbers between 0 and 9), and could someone give me the code to write?
Because after reading all your answers, this seems to be the only solution I have.