Many servers have a feature called cron that allows you to schedule tasks to run on your server at regular intervals – often multiple times a day. Your hosting provider may have a web interface for setting this up, or you may need to configure it from the command line using the crontab utility. For more information, try a search for crontab documentation.
Importer can import content at regular intervals via cron. Simply set up a cron job to hit the Importer URL on a regular schedule and the content will automatically be imported into your site.
If you look at the Importer Control Panel, you will see a list of your Import profiles. An Import profile is essentially the set of settings for importing data from a source, parsing it, and inserting or updating the data in ExpressionEngine. This list includes a column labeled Cron URLs.
There are two Cron URLs for each Importer profile. Because imports can use a significant amount of memory during processing, the Importer module has batch processing built in, which is triggered automatically when the number of items to be imported exceeds 100. The first URL sets up an import AND, if the number of items is under 100, runs the import immediately as well. If you expect your imports to contain more than 100 items, you will ALSO need to use the second URL to ensure the data is processed in batches. Batches are stored in the database and are ready for processing immediately.
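As a rough sketch of how the two URLs relate, the batch-processing URL is simply the setup URL with `&batch=yes` appended. The `ACT` number and hash below are the example values used later in this page; copy the real Cron URLs for your profile from the Importer Control Panel rather than building them by hand:

```shell
#!/usr/bin/env bash
# Example values only -- your ACT id and profile hash come from the
# Importer Control Panel's Cron URLs column.
base="http://ee2.dev/index.php"
act=24
hash="sWMrxOUj1haNrxOcT8fDZypSKWYuWeHH"

# First URL: sets up the import (and runs it immediately if under 100 items).
setup_url="${base}?ACT=${act}&hash=${hash}"

# Second URL: processes queued batches for imports of 100+ items.
batch_url="${setup_url}&batch=yes"

echo "$setup_url"
echo "$batch_url"
```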
NOTE: Since you will have multiple batches, the Batch Processing Cron URL should run more often than the setup URL. Also, if you prepare a cron batch, then change the source data file, and THEN let the batches run, the older data will still be in the batch queue and will still be loaded into the site.
Example Cron Setup
Create a new file called importer_cron.sh, which will set up the import, and add the following to the file (the hash will be different for each import profile; keep whichever of the wget or cURL commands suits your server and comment out the other):
#!/usr/bin/env bash

# wget Example
/usr/local/bin/wget -q -t 5 "http://ee2.dev/index.php?ACT=24&hash=sWMrxOUj1haNrxOcT8fDZypSKWYuWeHH"

# cURL Example
/usr/bin/curl --silent --compressed "http://ee2.dev/index.php?ACT=24&hash=sWMrxOUj1haNrxOcT8fDZypSKWYuWeHH"
Create a new file called importer_cron_batch.sh, which will run the batch processing, and add the following to the file (again, the hash will be different for each import profile, and you should keep only one of the two commands):
#!/usr/bin/env bash

# wget Example
/usr/local/bin/wget -q -t 5 "http://ee2.dev/index.php?ACT=24&hash=sWMrxOUj1haNrxOcT8fDZypSKWYuWeHH&batch=yes"

# cURL Example
/usr/bin/curl --silent --compressed "http://ee2.dev/index.php?ACT=24&hash=sWMrxOUj1haNrxOcT8fDZypSKWYuWeHH&batch=yes"
Create two cron entries for these files:
0 0,12 * * * sh /Library/WebServer/Documents/ExpressionEngine2/importer_cron.sh
*/10 * * * * sh /Library/WebServer/Documents/ExpressionEngine2/importer_cron_batch.sh
The first entry runs the initial import twice a day, at midnight and noon. The second checks for batches to run every ten minutes.
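If you prefer to add the entries from the command line instead of a web interface, one way is to append them to your existing crontab non-interactively. This is a sketch, not part of the Importer module; the paths are the example install location used above, and the `command -v` guard simply skips the install on systems without crontab:

```shell
#!/usr/bin/env bash
# Sketch: append both Importer entries to the current user's crontab
# without opening an editor. Adjust the paths for your own install.
cron_entries='0 0,12 * * * sh /Library/WebServer/Documents/ExpressionEngine2/importer_cron.sh
*/10 * * * * sh /Library/WebServer/Documents/ExpressionEngine2/importer_cron_batch.sh'

# "crontab -l" exits nonzero when no crontab exists yet, hence "|| true".
if command -v crontab >/dev/null 2>&1; then
    { crontab -l 2>/dev/null || true; printf '%s\n' "$cron_entries"; } | crontab -
fi
```

Afterwards, `crontab -l` should list both entries.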