python - Store a Big Dictionary with Restricted Memory -


I have an extremely big dictionary that I need to analyze.

How did the dictionary come into existence?

The dictionary is a pivot table of a log file. I take a snapshot of the inventory every day, and right now I have snapshots for the past month.

Each snapshot looks like this:

2013-01-01 apple  1000
2013-01-01 banana 2000
2013-01-01 orange 3000
....

I then group the records by product name, and plan to do time-series analysis on them later. The output should look like this:

{
  apple:  [(2013-01-01, 1000), (2013-01-02, 998),  (2013-01-03, 950) ...],
  banana: [(2013-01-01, 2000), (2013-01-02, 1852), (2013-01-03, 1232) ...],
  orange: ...
}
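For reference, the grouping itself is no more than this (a minimal sketch; the log file name and the whitespace-separated record layout are my assumptions):

from collections import defaultdict

# Group snapshot records by product name, entirely in memory.
# Assumes one whitespace-separated record per line: date, product, quantity.
series = defaultdict(list)
with open("snapshots.log") as f:   # hypothetical file name
    for line in f:
        date, product, quantity = line.split()
        series[product].append((date, int(quantity)))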

As you can see, assuming I have years and years of inventory snapshots and a wide inventory breadth, the dictionary turns out huge. The whole 'grouping' process happens in memory, and the size of the dictionary exceeds the memory limit.

I am wondering how to restrict memory usage to a specific amount (say 5 GB, since I don't want to disable the server for normal usage) and do the rest of the work on disk.

Here is a similar question to mine. I followed its 'best voted' answer, but memory still gets eaten up after I change the loop count to a real 'big data' size.

So an example that doesn't kill the memory would be appreciated; speed is not important to me.

(Note: there are several ways to optimize the data structure so that the dictionary size is reduced, but since the inventory snapshots are not periodic and products have different numbers of snapshots, the 'matrix' idea may not work.)

At this point, I suggest you stop using a dictionary and import sqlite3, or you're going to end up reinventing the wheel by implementing optimizations that databases already have.
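A minimal sketch of that direction with the standard library's sqlite3 (the database file, table, and column names are my assumptions, not anything from your setup):

import sqlite3

conn = sqlite3.connect("inventory.db")   # the data lives on disk, not in RAM
conn.execute("""
    CREATE TABLE IF NOT EXISTS snapshot (
        date     TEXT,
        product  TEXT,
        quantity INTEGER
    )
""")

def records(f):
    # Stream the log line by line so memory use stays flat.
    for line in f:
        date, product, quantity = line.split()
        yield (date, product, int(quantity))

with open("snapshots.log") as f:          # hypothetical file name
    conn.executemany("INSERT INTO snapshot VALUES (?, ?, ?)", records(f))
conn.commit()

# The 'grouping' becomes a query instead of a giant dictionary:
apples = conn.execute(
    "SELECT date, quantity FROM snapshot WHERE product = ? ORDER BY date",
    ("apple",),
).fetchall()                              # [('2013-01-01', 1000), ...]

An index on product (CREATE INDEX idx_product ON snapshot (product)) keeps the per-product lookups cheap once the table grows.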

To get started quickly, Elixir is a decent and practical ORM.
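From its tutorial, declaring the same table is roughly this much code (a sketch from memory; double-check the details against the Elixir docs):

from elixir import Entity, Field, Unicode, Integer, metadata, session, setup_all, create_all

class Snapshot(Entity):
    date = Field(Unicode(10))        # e.g. u'2013-01-01'
    product = Field(Unicode(60))
    quantity = Field(Integer)

metadata.bind = "sqlite:///inventory.db"  # hypothetical database file
setup_all()
create_all()

Snapshot(date=u"2013-01-01", product=u"apple", quantity=1000)
session.commit()

apples = Snapshot.query.filter_by(product=u"apple").all()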

