python - Error saving and loading a list of matrices
I have a list, data_list, that I save in one script in order to load it in another. First I converted it into an array, in this way:
data_array = np.array(data_list)
Then I saved it:
np.savez("file", data_array)
Then, in the other script, I want to access "file"; so:

a = np.load("file.npz")
b = a['arr_0']
I had used this code until 2 weeks ago and it worked fine. These days, trying to work with the program, it ends with an error identified in the line
b = a['arr_0']
"file" 300 mb file. strangest thing has stopped work. idea can happened?
PS: I'll give some more information. The list contains 180 matrices of 511x511. Each matrix contains decimal numbers (I tried to create 180 matrices of zeros, and the error occurs in the same way). If I reduce the number of matrices, the script works fine: in particular, up to 130 matrices it is OK, while above that the program doesn't work. Here I report the error message:
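For reference, the figures given above are enough to estimate the array's size in memory. This is a back-of-the-envelope sketch assuming NumPy's default float64 dtype (8 bytes per element):

```python
# Estimated in-memory size of the array described in the question:
# 180 matrices of 511x511 values, assuming float64 (8 bytes each).
n_matrices, side, itemsize = 180, 511, 8
total_bytes = n_matrices * side * side * itemsize
print(total_bytes)               # 376014240 bytes
print(total_bytes / 1024.0**2)   # roughly 358.6 MiB
```

That is consistent with the ~300 MB file reported, and it means each full copy of the data costs over a third of a gigabyte of address space.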
b = a['arr_0']
  File "c:\python27\lib\site-packages\numpy\lib\npyio.py", line 241, in __getitem__
    return format.read_array(value)
  File "c:\python27\lib\site-packages\numpy\lib\format.py", line 459, in read_array
    array = numpy.fromstring(data, dtype=dtype, count=count)
MemoryError
A MemoryError is an out-of-memory condition. That explains why it happens for objects of at least a certain size - more, and bigger, arrays require more memory, as you'd expect. What exactly the maximum size is, and why it seems to have changed, is harder to say. It can be highly specific to your system, depending on considerations like:

- how much memory (physical RAM and swap space) exists and is available to the operating system
- how much virtual memory the OS gives Python
- how much of that you're already using
- the implementation of the C library, in particular its malloc function, which can affect how Python uses the memory allocated to it

and possibly quite a few other things.
As per the comments, it seems the biggest problem here is that you are running a 32-bit build of Python. On Windows, a 32-bit process apparently has an effective maximum memory address space of around 2 GB. By my tests, the list of arrays you are using might take around a quarter of that. The fact that the error comes when reading the file back in suggests that NumPy deserialisation is relatively memory intensive, though I don't know enough about the implementation to be able to say why that would be. In any case, it seems that installing a 64-bit build of Python is your best bet.
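A quick way to confirm which build you are running is to ask the interpreter itself; struct.calcsize("P") is the size of a pointer, so 4 bytes means a 32-bit build:

```python
import struct
import sys

# Pointer size in bits: 32 on a 32-bit build, 64 on a 64-bit build.
bits = struct.calcsize("P") * 8
print(bits)

# Equivalent check: sys.maxsize exceeds 2**32 only on 64-bit builds.
print(sys.maxsize > 2**32)
```

If moving to 64-bit Python is not an option, one possible workaround (a sketch, not something tested against your data) is to save each matrix separately with np.save and read it back with np.load("x.npy", mmap_mode="r"), which memory-maps the file instead of loading it all at once; the file name here is only illustrative.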