Project:
View Issue Details
ID | ||||||||
0047145 | ||||||||
Type | Category | Severity | Reproducibility | Date Submitted | Last Update | |||
backport | [Openbravo ERP] A. Platform | critical | have not tried | 2021-06-15 12:35 | 2022-08-19 14:52 | |||
Reporter | AugustoMauch | View Status | public | |||||
Assigned To | AugustoMauch | |||||||
Priority | normal | Resolution | fixed | Fixed in Version | PR21Q1.4 | |||
Status | closed | Fix in branch | Fixed in SCM revision | |||||
Projection | none | ETA | none | Target Version | PR21Q1.4 | |||
OS | Any | Database | Any | Java version | ||||
OS Version | Database version | Ant version | ||||||
Product Version | SCM revision | |||||||
Review Assigned To | ||||||||
Web browser | ||||||||
Modules | Core | |||||||
Regression level | ||||||||
Regression date | ||||||||
Regression introduced in release | ||||||||
Regression introduced by commit | ||||||||
Triggers an Emergency Pack | No | |||||||
Summary | 0047145: Import entries can be permanently left unprocessed, in Initial status | |||||||
Description | The ImportEntryManager uses a ThreadPoolExecutor to execute import entries concurrently. The ThreadPoolExecutor stores pending tasks in an ArrayBlockingQueue. By default that queue has a maximum capacity of 1000; the value can be configured by setting the import.max.task.queue.size property. If a new Runnable is submitted to the ThreadPoolExecutor once its queue has reached maximum capacity, an exception is thrown and the Runnable is not added. The problem is that even though the Runnable is never actually submitted, the ImportEntryProcessor that submitted it caches it anyway, and as a consequence it determines that the Runnable is already being executed and does not try to resubmit it. When that happens, the import entries that are part of the rejected Runnables are permanently left in Initial status until the server is restarted. For the issue to be reproduced, several factors must occur at the same time: - The import entry processing throughput cannot keep up with the demand, so the queue grows until it reaches maximum capacity. - The number of (type of data, organization) combinations is very high. Usually there is a Runnable for each (type of data, organization) combination (each processor can define its own rules). If the number of combinations were lower than the capacity of the queue, the issue would not be reproduced. |
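The rejection behavior described above can be sketched with a minimal, self-contained example (this is not Openbravo code; the tiny queue capacity of 2 is chosen for illustration, whereas the real default is 1000, configurable via import.max.task.queue.size). A ThreadPoolExecutor backed by a bounded ArrayBlockingQueue throws RejectedExecutionException when a task is submitted while the queue is full:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class QueueRejectionDemo {
    public static void main(String[] args) throws Exception {
        // Single worker thread, bounded task queue of capacity 2.
        ThreadPoolExecutor executor = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(2));

        CountDownLatch block = new CountDownLatch(1);
        Runnable blocker = () -> {
            try { block.await(); } catch (InterruptedException ignored) { }
        };

        executor.submit(blocker); // occupies the single worker thread
        executor.submit(blocker); // queued (slot 1 of 2)
        executor.submit(blocker); // queued (slot 2 of 2)

        try {
            // Queue full and pool at maximum size: the default AbortPolicy
            // rejects the task instead of queueing it.
            executor.submit(blocker);
            System.out.println("submitted");
        } catch (RejectedExecutionException e) {
            System.out.println("rejected");
        }

        block.countDown();
        executor.shutdown();
    }
}
```

If the caller caches the Runnable before this submit call and ignores the exception path, the cached entry makes it look as though the work is in flight, which is exactly the failure mode this issue describes.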
Steps To Reproduce | This issue was reproduced by running a performance test on an environment with the following characteristics: - 711 organizations - 15 different import entry types of data The reproducibility was quite random until we reduced the number of available connections in the pool to 20. This reduction made the processing of the import entries much slower; as a consequence the import entry manager could not keep up with the demand, and the task queue reached its maximum capacity. Once the performance test was run under these conditions, some import entries were left in Initial status, and they were not processed until the server was restarted. |
Tags | No tags attached. | |||||||
Attached Files | ||||||||
Relationships |
Notes | |
(0129516) hgbot (developer) 2021-06-15 14:20 |
Merge Request created: https://gitlab.com/openbravo/product/openbravo/-/merge_requests/398 [^] |
(0129539) hgbot (developer) 2021-06-16 07:35 |
Merge request merged: https://gitlab.com/openbravo/product/openbravo/-/merge_requests/398 [^] |
(0129540) hgbot (developer) 2021-06-16 07:35 |
Directly closing issue as related merge request is already approved. Repository: https://gitlab.com/openbravo/product/openbravo [^] Changeset: 65876a5b76db7fafd584978512f5989303e0fd5d Author: Augusto Mauch <augusto.mauch@openbravo.com> Date: 2021-06-15T14:23:59+02:00 URL: https://gitlab.com/openbravo/product/openbravo/-/commit/65876a5b76db7fafd584978512f5989303e0fd5d [^] Fixes ISSUE-47145: Import entries can be permanently left unprocessed The problem was that it was possible for ImportEntryProcessor to cache a Runnable assuming that it had been properly submitted to an executorService, when in reality the submission had failed (e.g. because the queue of the executorService was at maximum capacity). This has been fixed by ensuring that the Runnable is not cached until the submission is known to have succeeded. The visibility of the submitRunnable method has been changed to package, because only ImportEntryProcessor needs access to it. --- M src/org/openbravo/service/importprocess/ImportEntryManager.java M src/org/openbravo/service/importprocess/ImportEntryProcessor.java --- |
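The fix pattern described in the commit (cache only after a successful submit) can be sketched as follows. This is a hypothetical illustration, not the actual Openbravo code: the class name, the runnables map, and the trySubmit method are invented stand-ins for the real ImportEntryProcessor internals.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.RejectedExecutionException;

public class SubmitThenCache {
    // Illustrative cache of runnables believed to be queued or running.
    private final Map<String, Runnable> runnables = new ConcurrentHashMap<>();
    private final ExecutorService executorService;

    public SubmitThenCache(ExecutorService executorService) {
        this.executorService = executorService;
    }

    public boolean trySubmit(String key, Runnable runnable) {
        if (runnables.containsKey(key)) {
            return false; // already believed to be queued or running
        }
        try {
            executorService.submit(runnable);
        } catch (RejectedExecutionException rejected) {
            // Submission failed (e.g. the bounded queue is at capacity).
            // Do NOT cache, so the import entry can be resubmitted later.
            return false;
        }
        // Cache only once the submission is known to have succeeded. The
        // buggy ordering was the reverse: cache first, then submit, leaving
        // a stale cache entry whenever the submit was rejected.
        runnables.put(key, runnable);
        return true;
    }
}
```

Note that this sketch still has a benign check-then-act window between containsKey and put; a production version would likely coordinate the cache update and submission under the processor's own locking.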
Issue History | |||
Date Modified | Username | Field | Change |
2021-06-15 14:14 | AugustoMauch | Type | defect => backport |
2021-06-15 14:14 | AugustoMauch | Target Version | => PR21Q1.4 |
2021-06-15 14:20 | hgbot | Note Added: 0129516 | |
2021-06-16 07:35 | hgbot | Note Added: 0129539 | |
2021-06-16 07:35 | hgbot | Resolution | open => fixed |
2021-06-16 07:35 | hgbot | Status | scheduled => closed |
2021-06-16 07:35 | hgbot | Fixed in Version | => PR21Q1.4 |
2021-06-16 07:35 | hgbot | Note Added: 0129540 | |
2021-06-16 10:35 | eugeni | Issue Monitored: eugeni | |
2022-08-19 14:52 | ivancaceres | Issue Monitored: ivancaceres |