Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Seeking insights on addressing indexing issues impacting our website

         

Scorter

5:45 pm on Apr 29, 2024 (gmt 0)



Our website has been experiencing challenges with indexing, leading to diminished visibility on search engine results pages (SERPs). Despite our efforts to optimize content and technical aspects, it seems that certain pages are not being properly indexed by search engines. We've checked robots.txt, optimized XML sitemaps, and ensured content quality, yet the problem persists.
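For what it's worth, here is roughly the kind of robots.txt sanity check we ran before posting - a minimal Python sketch using the stdlib's `urllib.robotparser` (the rules and URLs below are placeholders, not our real ones):

```python
from urllib import robotparser

# Hypothetical robots.txt content; substitute your site's actual file
# (or point set_url() at it and call read() to fetch over HTTP).
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Check whether Googlebot may fetch specific URLs.
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

Keep in mind this only confirms crawlability, not indexability - a page can be crawlable and still not get indexed.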

We're looking to engage with the community to gather additional strategies and recommendations for resolving these indexing issues effectively. What are some advanced techniques or lesser-known factors that could be influencing our site's indexation? How can we systematically diagnose and address these issues to improve our website's visibility and performance on search engines?

Any insights, experiences, or recommended tools would be greatly appreciated as we work towards optimizing our website's indexing and enhancing its presence in search engine results.

Thank you in advance for your valuable input!
<snip>


[edited by: not2easy at 7:49 pm (utc) on Apr 29, 2024]
[edit reason] Please see TOS [webmasterworld.com] [/edit]

SEO Learner 04

8:40 am on Apr 30, 2024 (gmt 0)

Top Contributors Of The Month



The recent Google HCU (Helpful Content Update) has reportedly kicked out 94% of the affected websites due to poor content. I think you should go through your website's content quality first.

universenet

1:25 pm on Apr 30, 2024 (gmt 0)

Top Contributors Of The Month



For many websites, Google shows the pages as indexed in Search Console and the number of indexed pages grows, yet the pages are removed from Google Search results. Just "tactics" from Google.

OldFaces

4:33 am on May 22, 2024 (gmt 0)

10+ Year Member Top Contributors Of The Month



Yeah Scorter, sorry to chime in and share my POV. We've struggled with indexing for years - actually for over a decade. We'll manually submit pages via Search Console that are 100% unique (they're about subjects with no online footprint, or hidden behind paywalls on other sites), and Google will index them for a while, but often then de-index them.

Like really...Google would rather have no pages indexed about this subject (you can do a google search for this exact subject and find absolutely NOTHING about it).

There are LOADS of possible reasons why. They range from 'quality index' issues, as Glenn Gabe has pointed out (that is, the aggregate quality of all pages on your domain or sub-directory), or a lack of 'brand authority' that doesn't justify every page you publish getting indexed...to stuff way outside your control, like Google point blank saying they want to 'crawl less' of the internet and (rightly so) don't want to index everything on it.

In the beginning they would index everything they could find....now decades later the internet is just too full of content.

There are services that charge to index your pages using the Google Indexing API, which is supposed to be used only for job postings and, I think, events... You could try that, but I don't think it's wise myself.
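For anyone curious what those services actually do under the hood: the Indexing API call itself is just an authenticated POST of a small JSON body. A rough sketch in Python's stdlib (the access token and URL are placeholders - a real call needs an OAuth 2.0 service-account token, and Google limits the API to job-posting and livestream pages):

```python
import json
import urllib.request

# Placeholder values; a real request needs a valid OAuth 2.0 bearer token.
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"
ACCESS_TOKEN = "ya29.placeholder-token"

# Notification body: the page URL and whether it was updated or deleted.
payload = json.dumps({
    "url": "https://example.com/some-page/",
    "type": "URL_UPDATED",  # or "URL_DELETED"
}).encode("utf-8")

req = urllib.request.Request(
    ENDPOINT,
    data=payload,
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {ACCESS_TOKEN}",
    },
)
# urllib.request.urlopen(req) would actually send it; left out here on purpose.
```

Nothing magic - which is partly why paying a third party to abuse it for regular content pages seems like a bad bet.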

Eva1995

5:36 pm on May 23, 2024 (gmt 0)



When you've ruled out robots.txt and XML sitemaps, it might be worth digging into your internal linking structure. Check whether there are any orphaned pages, or pages with stray noindex tags that could be complicating indexation.
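To hunt for stray noindex directives at scale, you can scan each page's HTML for robots meta tags. A minimal stdlib-only sketch in Python (the sample HTML is a placeholder; in practice you'd fetch each URL from your sitemap and feed its source through the parser):

```python
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Flags <meta name="robots"|"googlebot" content="...noindex..."> tags."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            content = (d.get("content") or "").lower()
            if name in ("robots", "googlebot") and "noindex" in content:
                self.noindex = True

# Hypothetical page source; substitute the real HTML of each page you audit.
html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
finder = NoindexFinder()
finder.feed(html)
print(finder.noindex)  # True
```

Don't forget the `X-Robots-Tag` HTTP response header either - a noindex can hide there even when the HTML looks clean.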