Forum Moderators: coopster
I have created a script that walks the database and uses fopen($url) to check whether each image still exists. This method is very slow, taking approximately 3 hours for the entire database. Can anyone advise if there is a more efficient way of doing this?
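One likely cause of the slowness: fopen() on a URL issues a GET and starts downloading the image body. A HEAD request transfers only the response headers, which is usually enough to tell whether the image exists. Below is a minimal sketch using PHP's curl extension; the helper name url_exists() is made up for illustration, and the timeout value is an assumption you would tune.

```php
<?php
// Sketch: check existence with an HTTP HEAD request instead of fopen(),
// so only headers are transferred, never the image body.
// url_exists() is a made-up helper name, not a built-in.
function url_exists(string $url): bool
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_NOBODY         => true,  // HEAD: ask for headers only
        CURLOPT_RETURNTRANSFER => true,  // don't echo anything to output
        CURLOPT_FOLLOWLOCATION => true,  // a redirect to a live image counts as "exists"
        CURLOPT_TIMEOUT        => 5,     // assumed timeout; don't hang on dead hosts
    ]);
    $ok   = curl_exec($ch) !== false;
    $code = curl_getinfo($ch, CURLINFO_RESPONSE_CODE);
    curl_close($ch);
    // HTTP(S) reports a status code; other schemes (e.g. file://) report 0,
    // in which case transfer success alone decides.
    return $ok && ($code === 0 || $code < 400);
}
```

Note that a few servers reject HEAD requests; for those you would fall back to a GET and abort after the headers arrive.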
Thank you in advance.
You could split the job up into smaller batches rather than checking everything in one long serial pass.
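Splitting into batches also opens the door to checking each batch in parallel, so the total time is divided by the concurrency rather than paid one URL at a time. A sketch with curl_multi, assuming PHP 8 (CurlHandle objects); check_batch() is a made-up helper name:

```php
<?php
// Sketch: check a batch of URLs in parallel with curl_multi, instead of
// one blocking request per image. check_batch() is a made-up helper name;
// assumes PHP 8, where curl handles are CurlHandle objects.
function check_batch(array $urls): array
{
    $mh   = curl_multi_init();
    $byId = [];                           // spl_object_id(handle) => url
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt_array($ch, [
            CURLOPT_NOBODY         => true,  // headers only, skip the body
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_TIMEOUT        => 5,     // assumed timeout
        ]);
        curl_multi_add_handle($mh, $ch);
        $byId[spl_object_id($ch)] = $url;
    }

    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh, 0.1);     // wait for activity, avoid busy-looping
        }
    } while ($active && $status === CURLM_OK);

    // Collect per-handle results: url => bool (image exists).
    $results = [];
    while (($info = curl_multi_info_read($mh)) !== false) {
        $ch   = $info['handle'];
        $url  = $byId[spl_object_id($ch)];
        $code = curl_getinfo($ch, CURLINFO_RESPONSE_CODE);
        $results[$url] = $info['result'] === CURLE_OK
            && ($code === 0 || $code < 400); // non-HTTP schemes report code 0
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

Feeding the database rows through this in chunks of a few dozen URLs should cut the 3-hour serial run roughly by the concurrency factor, network and remote servers permitting.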
Another idea: the more often an image has been checked successfully, the less likely it is to have changed or moved.
That might allow you to select the URLs that have never been checked first and work up towards the most-checked ones.
Just an idea; I'm not sure it would help, as I don't know how often images become invalid.
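That prioritisation can be as simple as keeping a counter per row and ordering the work queue by it. A sketch with PDO; the table and column names (images, url, times_checked) are made up, as are the helper names:

```php
<?php
// Sketch of "least-checked first" scheduling, assuming a counter column.
// Table/column names (images, url, times_checked) are hypothetical.
function next_urls_to_check(PDO $db, int $limit): array
{
    // Never-checked rows (times_checked = 0) sort first, then ascending.
    $stmt = $db->prepare(
        'SELECT url FROM images ORDER BY times_checked ASC LIMIT :n'
    );
    $stmt->bindValue(':n', $limit, PDO::PARAM_INT);
    $stmt->execute();
    return $stmt->fetchAll(PDO::FETCH_COLUMN);
}

function mark_checked(PDO $db, string $url): void
{
    // Bump the counter after a successful check.
    $db->prepare('UPDATE images SET times_checked = times_checked + 1 WHERE url = :u')
       ->execute([':u' => $url]);
}
```

Run a batch of the least-checked URLs each pass and the whole table still gets covered eventually, with the stalest rows served first.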