Opened 3 years ago
Closed 3 years ago
#2574 closed defect (fixed)
configurable size of gdal datasets cache in wcst_import
Reported by: Dimitar Misev
Owned by: Bang Pham Huu
Priority: major
Milestone: 10.0
Component: wcst_import
Version: 9.8
Keywords: (none)
Cc: (none)
Complexity: Medium
Description
We need to allow configuring how many open GDAL datasets the cache in gdal_util.py holds, because too many open files can run into various limits on Linux (e.g. the common per-process limit of 1024 open files), and each gdal.Open appears to create 7 threads, so a large cache can also exhaust thread limits.
- wcst_import.sh needs a new option:
  -c, --gdal-cache-size <size>
      The number of open GDAL datasets to keep in cache in order to avoid
      reopening the same files, which can be costly. The value can be one of:
      -1 (no limit, cache all datasets), 0 (fully disable caching), or
      N > 0 (clear the cache whenever it has more than N datasets).
      The default is -1 if this option is not specified.
Change History (3)
comment:1 by , 3 years ago
comment:2 by , 3 years ago
Code to clear the cache:

    def _clear_gdal_dataset_cache(self):
        # release all cached gdal.Dataset references so GDAL can close the files
        global _gdal_dataset_cache
        for _, ds in _gdal_dataset_cache.items():
            ds = None
        _gdal_dataset_cache = {}
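Worth noting about the code above: setting the loop variable ds to None only rebinds a local name; it is the final reassignment of _gdal_dataset_cache to an empty dict that actually drops the dict's references and lets CPython free the datasets. A hedged illustration (not wcst_import code, using a FakeDataset stand-in for osgeo.gdal.Dataset and weakref to observe object lifetime):

```python
import weakref

class FakeDataset:          # stand-in for an osgeo.gdal.Dataset
    pass

cache = {"f1.tif": FakeDataset(), "f2.tif": FakeDataset()}
refs = [weakref.ref(ds) for ds in cache.values()]

for _, ds in cache.items():
    ds = None               # rebinds the loop variable only; the dict still holds both objects
alive_after_loop = [r() is not None for r in refs]

cache = {}                  # dropping the dict's references is what frees the datasets (CPython refcounting)
alive_after_clear = [r() is not None for r in refs]
```

Here alive_after_loop is all True and alive_after_clear all False, showing the dict reassignment does the real work.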
comment:3 by , 3 years ago
Resolution: → fixed
Status: assigned → closed
At the end of https://doc.rasdaman.org/05_geo-services-guide.html#data-import a new section can be added: