Issue
I am using Logstash to import data from MySQL into Elasticsearch. The SQL statement tracks the update_timestamp column of a table, and the job is scheduled to run every 1 minute. There are some special cases where the SQL cannot finish within 1 minute (e.g. the initial import into a new ES instance). As a side note, it seems that Logstash does the import in batches of 100k rows if the SQL matches more than 100k rows.
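For reference, the setup described above roughly corresponds to a jdbc input pipeline like the sketch below. All connection details, the table name, and the index name are placeholders, not taken from the question; the 100k-row batching mentioned above likely corresponds to jdbc_page_size, whose default is 100000 when jdbc_paging_enabled is turned on.

input {
  jdbc {
    # Placeholder connection settings -- not from the original question.
    jdbc_driver_library => "/path/to/mysql-connector-j.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"

    # Run the statement every minute (cron syntax).
    schedule => "* * * * *"

    # Track update_timestamp so each run only picks up rows changed since the last run.
    statement => "SELECT * FROM my_table WHERE update_timestamp > :sql_last_value"
    use_column_value => true
    tracking_column => "update_timestamp"
    tracking_column_type => "timestamp"

    # Paging; jdbc_page_size defaults to 100000, matching the 100k batches observed.
    jdbc_paging_enabled => true
    jdbc_page_size => 100000
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "my_table"
    document_id => "%{id}"
  }
}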
The question is:
If the SQL cannot finish within 1 minute (i.e. before the next scheduled run starts), what will Logstash do?
Will it:
- Skip the next scheduled task? This seems to be the case from my observation, but I am not sure.
- Delay the next scheduled task, but never skip one?
- Or something else?
Solution
The jdbc input sets max_work_threads for the Rufus scheduler to one. If there is no available worker thread, then trigger_queue does nothing, so that instance of the job will never be run. The job simply waits until the next time the queue is triggered. In other words, a scheduled run that fires while the previous query is still executing is skipped, not delayed.
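As a rough illustration of that behaviour, here is a hypothetical standalone Ruby sketch using the rufus-scheduler gem directly (not the jdbc input's own code): the scheduler is limited to one work thread and runs a job that overruns its interval. Per the explanation above, triggers that fire while the single worker is still busy are dropped rather than queued, so the job would be expected to start again only at the first trigger after the previous run finishes.

require 'rufus-scheduler'

# One work thread, mirroring what the jdbc input configures.
scheduler = Rufus::Scheduler.new(max_work_threads: 1)

# A job that takes ~25s but is scheduled every 10s, so it overruns its interval.
scheduler.every '10s' do
  puts "run started  #{Time.now}"
  sleep 25   # stand-in for a long-running SQL statement
  puts "run finished #{Time.now}"
end

scheduler.join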
Answered By - Badger
Answer Checked By - Cary Denson (WPSolving Admin)