Forum Moderators: Robert Charlton & goodroi


Confused about 500 server error


jediviper

12:55 pm on Mar 26, 2020 (gmt 0)

5+ Year Member Top Contributors Of The Month



So I have discovered through Semrush thousands of internal pages that are broken with a 500 server error.
I tested some of them and the pages are working normally; some others are not available any more.
One of the pages that is working normally I tested through the URL Inspection tool in GSC, and the result was that it was discoverable through the sitemap, but not indexed because it is blocked by robots.txt.
The next step was to test the robots.txt block, and the result was that there was NO block: Googlebot was allowed to fetch it normally!

So what is happening? Google says through one tool that the robots.txt file is blocking a page, then says through another tool that it is not blocked.
What should I do about all these thousands of pages with the 500 server error?
How can I tell whether my site's performance is actually affected by what Semrush is reporting?
BTW, the "Coverage->Submitted URL blocked by robots.txt" report includes only 7 links.

lammert

6:02 pm on Mar 27, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The easiest way to find the cause of a 500 Server Error is to look in the server's log files; there should be more information there. External tools like GSC only know that the error happened, not why it happened.
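As a rough sketch of the log-checking approach suggested above, assuming an Apache/Nginx combined log format (the sample lines and paths below are illustrative, not taken from the thread):

```python
# Sketch: pull out the URLs that returned a 500 from an access log.
# Assumes combined log format; the sample log content is hypothetical.
sample_log = (
    '66.249.66.1 - - [26/Mar/2020:12:00:01 +0000] "GET /product/1 HTTP/1.1" 200 5120\n'
    '66.249.66.1 - - [26/Mar/2020:12:00:02 +0000] "GET /product/2 HTTP/1.1" 500 312\n'
)

def urls_with_500(log_text):
    hits = []
    for line in log_text.splitlines():
        parts = line.split('"')
        if len(parts) < 3:
            continue
        request = parts[1]            # e.g. 'GET /product/2 HTTP/1.1'
        after = parts[2].split()      # status code and bytes sent
        if after and after[0] == "500":
            hits.append(request.split()[1])  # the requested path
    return hits

print(urls_with_500(sample_log))  # -> ['/product/2']
```

In a real log the error_log (rather than the access log) usually carries the actual reason for the 500, so checking both is worthwhile.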

lucy24

8:08 pm on Mar 27, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



You do not have a 500 server error. You have an incorrect report from Semrush. It’s appropriate that the question was posted in the Google subforum, because it is far more common to have to reassure people that such-and-such scary message you got from GSC can safely be ignored.

It is, of course, remotely possible that at the very moment Semrush did their crawl, something was wrong with the server. An even remoter possibility is that something in your htaccess (probably not possible in config) caused the server to return a 500 response to all requests from this specific visitor.

some other are not available any more.
What does “not available any more” mean? Have you removed them (preferably with a 410 response), or is there an unexplained problem with URLs that are supposed to be there but can’t be reached?
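For the removed-page case mentioned above, a minimal .htaccess sketch, assuming an Apache server and purely hypothetical paths (not taken from the thread):

```apache
# Return 410 Gone for a single removed product page (path is illustrative).
Redirect gone /product/old-widget

# Or, with mod_rewrite, for a whole removed section (path is illustrative).
RewriteEngine On
RewriteRule ^discontinued/ - [G]
```

A 410 tells Google the removal is deliberate and permanent, so the URLs tend to drop out of reports faster than they would with a plain 404.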

jediviper

5:45 pm on Mar 28, 2020 (gmt 0)

5+ Year Member Top Contributors Of The Month



Thank you both for your answers.

@lucy24 some of the product pages were probably removed by the dev team, so that's why they are not available any more. Don't ask me why this is happening; it's just something I can't control.

Any idea about the robots.txt problem? Why do some of the pages appear (in the URL Inspection tool) as blocked, but then seem to be fine in the robots.txt Tester?

not2easy

6:44 pm on Mar 28, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Any idea about the robots.txt problem? Why do some of the pages appear (in the URL Inspection tool) as blocked, but then seem to be fine in the robots.txt Tester?
Google is known to consider a page/URL 'blocked' if some resource on that page is blocked, even when the only thing blocked is access to AdSense resources on their own site. In the URL Inspection tool you should be able to click through to see what they are calling 'blocked' - though they have made it more difficult than in the old GSC to see what to click for that.

If robots.txt is blocking CSS or JS files, for example, they call the page blocked. Not a major concern, but occasionally you can find some accessory script file that is inadvertently blocked in robots.txt. If the pages in question use 3rd party tools, those may be blocked at the 3rd party source.
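The distinction described above can be illustrated with Python's standard-library robots.txt parser - the page's own URL is allowed, while a resource directory it loads from is disallowed (the robots.txt rules and paths here are hypothetical):

```python
from urllib import robotparser

# Hypothetical robots.txt: pages are crawlable, but a JS directory is not.
robots_txt = """\
User-agent: *
Disallow: /assets/js/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# The page URL itself is allowed, so the robots.txt Tester reports no block...
page_ok = rp.can_fetch("Googlebot", "/product/widget")

# ...but a script the page depends on is disallowed, which URL Inspection
# can surface as the page being "blocked by robots.txt".
script_ok = rp.can_fetch("Googlebot", "/assets/js/app.js")

print(page_ok, script_ok)  # -> True False
```

This is why the two GSC tools can appear to contradict each other: they are answering different questions about the same page.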

jediviper

12:05 pm on Mar 30, 2020 (gmt 0)

5+ Year Member Top Contributors Of The Month



Thanks @not2easy for your explanation.