Hello Webmaster Community,
I'm currently encountering a significant problem: Google is indexing a large number of parameter-based URLs on my website, even though these pages are blocked in my robots.txt file.
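For reference, my disallow rules look roughly like this (the parameter names here are simplified, but the wildcard pattern matches what is live on the site):

    User-agent: *
    Disallow: /*?sort=
    Disallow: /*?filter=
    Disallow: /*?page=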
Here's some brief context: my website only has 10,000 to 12,000 pages, but Google's crawler appears to be treating many parameter URLs as unique pages and indexing them. As of today, the indexed page count is 26K, while the actual number of pages is no more than 11K.
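To illustrate, the extra indexed entries are just variations of the same underlying page, something like this (domain and parameters are placeholders):

    https://example.com/products
    https://example.com/products?sort=price
    https://example.com/products?sort=price&page=2

All of these resolve to essentially the same content.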
This is negatively impacting my website's SEO rankings, and I'm struggling to understand why these blocked URLs are being indexed. The number of indexed pages continues to rise every three days.
Could anyone help me troubleshoot this issue and suggest potential solutions? Why would Google index pages that are explicitly disallowed in robots.txt? Should I modify my current setup in any specific way? For further context, canonical tags have already been implemented.
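To be concrete about the canonical setup: every parameter URL carries a canonical tag pointing back to its clean URL, roughly like this (the URL is a placeholder):

    <link rel="canonical" href="https://example.com/products">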
I appreciate any advice or suggestions, and I'm happy to provide more information if needed. This issue is actively hurting my site's performance, so timely help would be greatly appreciated. Thanks!