As the title states: I disallowed some pages via my site's robots.txt file. These pages aren't meant for search results and were potentially contributing to crawl budget issues.
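For reference, the rules I added were along these lines (the paths here are just placeholders, not my actual URLs):

    User-agent: *
    Disallow: /internal-search/
    Disallow: /faceted-filters/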
Two days after publishing the change to robots.txt, all of those pages showed up as "Good" in the Google Search Console report at Experience > Core Web Vitals > Desktop or Mobile > Good URLs.
I'm confused by this. Has anyone else seen this happen?
It's been about a week now. Will this stick, and why is it happening? When I inspect each of those URLs in GSC, the tool confirms they are blocked by robots.txt, so I don't understand how it can simultaneously report them as "Good" for Core Web Vitals when it isn't crawling them.