Forum Moderators: martinibuster
$random = rand(1, 2);
if ($random == 1) {
    // 300x250 Rectangle
} else {
    // Responsive Linkblock
}
This policy change doesn't explain why some of the ad spots on the page are filled, but once they are crawled, the problem should be over. At least I would think so.
What percentage drop in coverage have you seen because of this update?
[edited by: riccarbi at 4:14 pm (utc) on Jan 10, 2018]
If a page is half filled with ads, then the new policy explains nothing.
If our uncrawled URLs are not kept in the index for at least a month, then we are all doomed.
It must be extremely backed up, and much slower than I think Google expected it would be.
The two crawlers are separate, but they do share a cache. We do this to avoid both crawlers requesting the same pages, thereby helping publishers conserve their bandwidth. [support.google.com]
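The shared-cache behaviour that quote describes can be sketched roughly like this. This is a minimal illustrative model, not Google's actual implementation; `fetch_page`, `SharedCache`, and the cache policy are all assumptions made for the sketch.

```python
# Illustrative model of two crawlers sharing a fetch cache, so a URL
# requested by one crawler is not re-fetched by the other.
# Assumption: this mirrors the behaviour described in the quote,
# not Google's real code.

fetch_log = []  # records actual fetches, to show the bandwidth saved


def fetch_page(url):
    """Stand-in for a real HTTP request (hypothetical helper)."""
    fetch_log.append(url)
    return f"<html>content of {url}</html>"


class SharedCache:
    def __init__(self):
        self._pages = {}

    def get(self, url):
        # Only the first crawler to ask for a URL triggers a real fetch;
        # every later request is served from the shared cache.
        if url not in self._pages:
            self._pages[url] = fetch_page(url)
        return self._pages[url]


cache = SharedCache()

# Googlebot crawls the page first...
googlebot_copy = cache.get("https://example.com/article")
# ...then the AdSense crawler asks for the same URL and gets the cached copy.
adsense_copy = cache.get("https://example.com/article")

print(len(fetch_log))  # one real fetch, even though two crawlers asked
```

The point of the shared cache is exactly what the quote says: the publisher's server sees one request instead of two.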
How can the crawler determine such compliance? If I write on a page "this page doesn't contain #*$!ography", does the crawler read the word "#*$!ography" and mark the page as unsuitable for AdSense and/or AdWords?
If I update some info on a page (e.g. a change of date for an event), does AdSense need to re-crawl the page? And how long, and how much computational power, would it take to re-crawl millions of such pages a day?
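To put that re-crawl question in rough perspective, here is a back-of-envelope calculation. The fetch rate is an assumed number chosen purely for illustration, not a published Google figure.

```python
# Back-of-envelope: how many pages per day a crawler can re-fetch at a
# given sustained rate. The 200 fetches/second figure is an assumption
# for illustration only.

fetches_per_second = 200
seconds_per_day = 24 * 60 * 60  # 86,400

pages_per_day = fetches_per_second * seconds_per_day
print(pages_per_day)  # 17,280,000 pages/day at this assumed rate
```

Even at a modest assumed rate, "millions of pages a day" is plausible; the open question is how much of that capacity the AdSense crawler actually gets.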
There is no way to tell from a screen cap; you need to check your stats in the "Ad Networks" report.
2) They say at the bottom of their doc that reindexing takes 1-2 weeks, which to me is painfully slow.
3) You're absolutely right about this. If I were a news site or blogger, I would be so mad that my breaking news articles or posts aren't serving ads immediately. What's the point anymore!
1) Googlebot and the AdSense crawler don't coincide.
AdWords and DoubleClick Bid Manager have adopted more restrictive bidding on ad requests coming from URLs that are uncrawled.