Auto batching #490
Replies: 5 comments 14 replies
-
Hey @thijndehaas, I would like to know the parameters you use with your Meilisearch instance, as you specified. Thank you and have a good day!
-
Hello @Kerollmops, apologies for the late reply. However, this still seems to be the case in the latest version, where the auto-batching is enabled by default. Somehow tasks even seem to stay in the queue when the queue has been inactive for a while. The hanging tasks only get processed when I add new tasks to the queue. This is the startup command:

```shell
/home/meilisearch/meilisearch \
  --http-addr=127.0.0.1:7700 \
  --env=production \
  --db-path=/home/meilisearch/data.ms \
  --master-key=HIDDEN \
  --debounce-duration-sec=3 \
  --max-indexing-threads=20
```
-
My problem with the 20 tasks might be solved, but processing jobs still seems slow, and I guess it's not running in bulk mode, since only one task is processing at a time. Can bulk mode be prevented because I add a lot of delete/addition tasks after each other?
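One way to check whether tasks are actually being grouped is to query the global tasks route (available since Meilisearch v0.28); tasks that were processed together in one auto-batch report the same `startedAt`/`finishedAt` timestamps. The host, port, and key below are placeholders matching the startup command above, not values from this thread:

```shell
# List recent tasks and their statuses; inspect startedAt/finishedAt
# to see which tasks were processed in the same batch.
curl -s 'http://127.0.0.1:7700/tasks' \
  -H 'Authorization: Bearer HIDDEN'
```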
-
I have resolved the problem: my own code was preventing bulk indexing. I was also submitting my bulk queue for removing documents whenever I submitted my bulk queue for adding documents. Because most of our products are actually not in stock, we remove more products than we add when doing a full sync. Here is a very simplified example class:

```php
class Example
{
    private $additionsQueue = [];
    private $removeQueue = [];

    public function add($item)
    {
        $this->additionsQueue[] = $item;
        $this->syncItems();
    }

    public function remove($itemId)
    {
        $this->removeQueue[] = $itemId;
        $this->syncItems();
    }

    public function syncItems()
    {
        if (count($this->additionsQueue) > 5_000) { // Now changed to 10_000
            $this->index->addDocuments($this->additionsQueue);
            $this->additionsQueue = [];
            $this->index->removeDocuments($this->removeQueue); // The problem
            $this->removeQueue = []; // The problem
        }

        if (count($this->removeQueue) > 50_000) { // Now changed to 100_000
            $this->index->removeDocuments($this->removeQueue);
            $this->removeQueue = [];
        }
    }
}
```

A task queue example:

```
[batch 1]  documentAdditionOrUpdate
[batch 2]  documentDeletion
[batch 3]  documentAdditionOrUpdate // because the task from batch 2 was added, it is not in batch 1
[batch 4]  documentDeletion        // because the task from batch 3 was added, it is not in batch 2
[batch 5]  documentAdditionOrUpdate
[batch 6]  documentDeletion
[batch 7]  documentAdditionOrUpdate
[batch 8]  documentDeletion
[batch 8]  documentDeletion
[batch 8]  documentDeletion
[batch 8]  documentDeletion
[batch 8]  documentDeletion
[batch 9]  documentAdditionOrUpdate
[batch 9]  documentAdditionOrUpdate
[batch 9]  documentAdditionOrUpdate
[batch 9]  documentAdditionOrUpdate
[batch 9]  documentAdditionOrUpdate
[batch 9]  documentAdditionOrUpdate
[batch 9]  documentAdditionOrUpdate
[batch 10] documentDeletion
[batch 11] documentAdditionOrUpdate // because the task from batch 10 was added, it is not in batch 9
```
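The fix described above could be sketched as follows. This is my reading of the change, not the author's exact code: the deletion queue is only flushed from its own threshold branch, so consecutive additions (or deletions) stay adjacent in the task queue and can be auto-batched. `FakeIndex` is a hypothetical stand-in for the real Meilisearch index client, used only to make the sketch self-contained:

```php
<?php
// Hypothetical stand-in for the Meilisearch index client, recording calls
// so the flush behaviour can be observed without a running server.
class FakeIndex
{
    public $calls = [];
    public function addDocuments(array $docs)   { $this->calls[] = ['add', count($docs)]; }
    public function removeDocuments(array $ids) { $this->calls[] = ['remove', count($ids)]; }
}

class Example
{
    private $index;
    private $additionsQueue = [];
    private $removeQueue = [];

    public function __construct($index) { $this->index = $index; }

    public function add($item)
    {
        $this->additionsQueue[] = $item;
        $this->syncItems();
    }

    public function remove($itemId)
    {
        $this->removeQueue[] = $itemId;
        $this->syncItems();
    }

    public function syncItems()
    {
        // Flush each queue only from its own branch: flushing additions no
        // longer submits a deletion task, so same-type tasks stay consecutive.
        if (count($this->additionsQueue) > 10_000) {
            $this->index->addDocuments($this->additionsQueue);
            $this->additionsQueue = [];
        }
        if (count($this->removeQueue) > 100_000) {
            $this->index->removeDocuments($this->removeQueue);
            $this->removeQueue = [];
        }
    }
}
```

With this version a full sync produces long runs of `documentAdditionOrUpdate` tasks followed by long runs of `documentDeletion` tasks, instead of the alternating pattern shown in the task queue example above.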
-
Just as a status update: I have updated to Meilisearch v0.29.0rc2, I'm running my script with 3x more instances, and I still can't build up a large task queue because the tasks are processed really fast. It seems like something changed in a good way 😁
-
I know auto-batching is still in beta, but I have some feedback/questions about it.
I was wondering when auto-batching starts indexing. It seems like, as long as I'm actively writing data to the API, the queued data is processed very slowly. When I'm not adding new data, the remaining queue starts indexing fast.

There are already a lot of tasks processing, but the server has plenty of resources left to do more.
F.Y.I.: this is currently not running in Docker anymore, but I still use the same directory.
For now this won't be a problem, but I'm wondering what will happen when we deploy this with, for example, 10,000,000 products instead of 600,000.