Forum Moderators: coopster
If it's a website, you can use a spider... It depends on what the data is.
You can create a PHP script that does this:
grab data > parse data > insert/update SQL
then run that script as often as you feel is necessary. But if the data is always changing, your copy will always be inaccurate to some degree.
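The grab > parse > insert/update flow above could be sketched something like this. This is just a rough example, assuming a hypothetical CSV feed URL and a hypothetical `quotes` table with a unique key on `symbol`; adjust the DSN, credentials, and columns to your own setup:

```php
<?php
// Split a raw CSV payload into rows of fields.
function parse_feed(string $raw, string $delimiter = ','): array
{
    $rows = [];
    foreach (explode("\n", trim($raw)) as $line) {
        $rows[] = str_getcsv($line, $delimiter);
    }
    return $rows;
}

// Upsert parsed rows: insert new symbols, update prices for existing ones.
function store_rows(PDO $pdo, array $rows): void
{
    $stmt = $pdo->prepare(
        'INSERT INTO quotes (symbol, price) VALUES (?, ?)
         ON DUPLICATE KEY UPDATE price = VALUES(price)'
    );
    foreach ($rows as $fields) {
        $stmt->execute([$fields[0], (float) $fields[1]]);
    }
}

// Usage (run from cron as often as the data warrants):
// $raw = file_get_contents('https://example.com/feed.csv'); // grab
// $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
// store_rows($pdo, parse_feed($raw));                       // parse + upsert
```

The `ON DUPLICATE KEY UPDATE` keeps the script re-runnable from cron without piling up duplicate rows (that's MySQL syntax; other databases have their own upsert forms).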
The solution will depend on what data you are grabbing, and whether it needs to be guaranteed accurate. Weather? Stock quotes? Ticket prices? Gross National Debt? The population of Zimbabwe?
One quick addition here:
I have a URL that offers export feeds in three formats (.tab, .csv, and .pipe)...
will I be able to grab a .tab and import that directly or do I need to save it, import it, and then pull from it?
Any good resources out there on doing something like this?
Thanks again in advance!
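On the direct-import question: you can usually parse the feed straight off the URL in memory, no need to save it to a file first, as long as `allow_url_fopen` is enabled in php.ini. A rough sketch, assuming a hypothetical feed URL (tab is the delimiter here; swap in `,` or `|` for the other formats):

```php
<?php
// Read a remote delimited feed directly into an array of rows.
// fopen() accepts http:// URLs when allow_url_fopen is on, and
// fgetcsv() takes the field delimiter as its third argument.
function read_delimited_feed(string $url, string $delimiter = "\t"): array
{
    $fh = fopen($url, 'r');
    if ($fh === false) {
        throw new RuntimeException("Could not open feed: $url");
    }
    $rows = [];
    while (($fields = fgetcsv($fh, 0, $delimiter)) !== false) {
        $rows[] = $fields;
    }
    fclose($fh);
    return $rows;
}

// Usage (hypothetical URLs):
// $rows = read_delimited_feed('https://example.com/export.tab');       // tab
// $rows = read_delimited_feed('https://example.com/export.csv', ',');  // csv
// $rows = read_delimited_feed('https://example.com/export.pipe', '|'); // pipe
```

Saving to a local file first is only really needed if you want to use something like MySQL's `LOAD DATA INFILE`, which reads from disk; for modest feed sizes, looping the rows through a prepared INSERT is fine.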