I have a use case where I batch upsert many documents once per day, and then additionally upsert throughout the day.
I’m experiencing unexpected memory usage after upserting, both after the batch (which uses pipelining) and after the incremental upserts through the day. Image is redislabs/redisearch:1.4.10, clients are redisearch-py 0.7.1 and jredisearch 0.22.0. Any help would be appreciated. Here is a little script to demonstrate:
```
import redis
import redisearch

if __name__ == '__main__':
    r = redis.Redis()
    rs = redisearch.Client('index', conn=r)

    # Create the index once; ignore the error if it already exists
    try:
        rs.create_index([redisearch.TextField('field')])
    except Exception:
        pass

    # Repeatedly upsert the same document and watch used_memory grow
    while True:
        rs.add_document('doc', replace=True, partial=True, field='1 2')
        print(r.info(section='memory')['used_memory'], end='\r')
```
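For context (separate from the repro above), the daily batch path does roughly the following. This is only a sketch: the document IDs and field values are placeholders, and it just binds the RediSearch client to a redis-py pipeline so the FT.ADD calls are sent in one round trip, which is how I understand our batch job behaves.

```
import redis
import redisearch

r = redis.Redis()

# Buffer the FT.ADD commands in a pipeline and send them in one round trip.
pipe = r.pipeline(transaction=False)
rs = redisearch.Client('index', conn=pipe)

# Placeholder documents; the real job iterates over our actual data.
docs = {'doc:%d' % i: {'field': 'value %d' % i} for i in range(1000)}
for doc_id, fields in docs.items():
    rs.add_document(doc_id, replace=True, partial=True, **fields)

pipe.execute()
```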
In the time it took to write this up, the demo script had pushed used_memory to 13829320, and it stays there after stopping the script and running DEBUG RELOAD.
The DBSIZE consistently shows 4 keys as expected. Output of KEYS:
```
"ft:index/2"
"ft:index/1"
"doc"
"idx:index"
```
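If the index-side breakdown is useful, this is roughly how I’d pull it over the same connection. FT.INFO returns an interleaved list of names and values; the field names below (inverted_sz_mb and friends) are what I understand RediSearch 1.4 reports, so treat this as a sketch. I haven’t included that output here but can.

```
import redis

r = redis.Redis()

# FT.INFO returns [name, value, name, value, ...]; fold it into a dict.
raw = r.execute_command('FT.INFO', 'index')
info = dict(zip(raw[::2], raw[1::2]))

# These sizes should show whether the growth is inside the index structures.
for key in (b'num_docs', b'num_terms', b'num_records',
            b'inverted_sz_mb', b'offset_vectors_sz_mb', b'doc_table_size_mb'):
    print(key.decode(), info.get(key))
```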
INFO MEMORY after the script has been stopped:
```
# Memory
used_memory:13808448
used_memory_human:13.17M
used_memory_rss:120729600
used_memory_rss_human:115.14M
used_memory_peak:68664976
used_memory_peak_human:65.48M
used_memory_peak_perc:20.11%
used_memory_overhead:841246
used_memory_startup:791360
used_memory_dataset:12967202
used_memory_dataset_perc:99.62%
allocator_allocated:15296824
allocator_active:19308544
allocator_resident:84508672
total_system_memory:2095869952
total_system_memory_human:1.95G
used_memory_lua:37888
used_memory_lua_human:37.00K
used_memory_scripts:0
used_memory_scripts_human:0B
number_of_cached_scripts:0
maxmemory:0
maxmemory_human:0B
maxmemory_policy:noeviction
allocator_frag_ratio:1.26
allocator_frag_bytes:4011720
allocator_rss_ratio:4.38
allocator_rss_bytes:65200128
rss_overhead_ratio:1.43
rss_overhead_bytes:36220928
mem_fragmentation_ratio:8.77
mem_fragmentation_bytes:106962176
mem_not_counted_for_evict:0
mem_replication_backlog:0
mem_clients_slaves:0
mem_clients_normal:49694
mem_aof_buffer:0
mem_allocator:jemalloc-5.1.0
active_defrag_running:0
lazyfree_pending_objects:0
```
INFO MEMORY after DEBUG RELOAD shows the same:
```
# Memory
used_memory:13808496
used_memory_human:13.17M
used_memory_rss:131489792
used_memory_rss_human:125.40M
used_memory_peak:68664976
used_memory_peak_human:65.48M
used_memory_peak_perc:20.11%
used_memory_overhead:841278
used_memory_startup:791360
used_memory_dataset:12967218
used_memory_dataset_perc:99.62%
allocator_allocated:15586288
allocator_active:19595264
allocator_resident:97402880
total_system_memory:2095869952
total_system_memory_human:1.95G
used_memory_lua:37888
used_memory_lua_human:37.00K
used_memory_scripts:0
used_memory_scripts_human:0B
number_of_cached_scripts:0
maxmemory:0
maxmemory_human:0B
maxmemory_policy:noeviction
allocator_frag_ratio:1.26
allocator_frag_bytes:4008976
allocator_rss_ratio:4.97
allocator_rss_bytes:77807616
rss_overhead_ratio:1.35
rss_overhead_bytes:34086912
mem_fragmentation_ratio:9.55
mem_fragmentation_bytes:117722312
mem_not_counted_for_evict:0
mem_replication_backlog:0
mem_clients_slaves:0
mem_clients_normal:49694
mem_aof_buffer:0
mem_allocator:jemalloc-5.1.0
active_defrag_running:0
lazyfree_pending_objects:0
```
INFO MEMORY after FLUSHDB shows most but not quite all memory released:
```
# Memory
used_memory:1220400
used_memory_human:1.16M
used_memory_rss:131559424
used_memory_rss_human:125.46M
used_memory_peak:68664976
used_memory_peak_human:65.48M
used_memory_peak_perc:1.78%
used_memory_overhead:841054
used_memory_startup:791360
used_memory_dataset:379346
used_memory_dataset_perc:88.42%
allocator_allocated:2672768
allocator_active:6660096
allocator_resident:97402880
total_system_memory:2095869952
total_system_memory_human:1.95G
used_memory_lua:37888
used_memory_lua_human:37.00K
used_memory_scripts:0
used_memory_scripts_human:0B
number_of_cached_scripts:0
maxmemory:0
maxmemory_human:0B
maxmemory_policy:noeviction
allocator_frag_ratio:2.49
allocator_frag_bytes:3987328
allocator_rss_ratio:14.62
allocator_rss_bytes:90742784
rss_overhead_ratio:1.35
rss_overhead_bytes:34156544
mem_fragmentation_ratio:111.55
mem_fragmentation_bytes:130380048
mem_not_counted_for_evict:0
mem_replication_backlog:0
mem_clients_slaves:0
mem_clients_normal:49694
mem_aof_buffer:0
mem_allocator:jemalloc-5.1.0
active_defrag_running:0
lazyfree_pending_objects:0
```
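In case it’s easier to reproduce end to end, this is roughly the sequence above in one script (same placeholder index and document as the demo, with a bounded loop instead of `while True`):

```
import redis
import redisearch

r = redis.Redis()
rs = redisearch.Client('index', conn=r)
try:
    rs.create_index([redisearch.TextField('field')])
except Exception:
    pass

def used_memory():
    return r.info(section='memory')['used_memory']

# Upsert the same document a fixed number of times.
for _ in range(100000):
    rs.add_document('doc', replace=True, partial=True, field='1 2')

print('after upserts:', used_memory())
r.execute_command('DEBUG', 'RELOAD')
print('after DEBUG RELOAD:', used_memory())
r.flushdb()
print('after FLUSHDB:', used_memory())
```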