Discussion: pickle: maximum recursion depth exceeded
Ulrich Petri
2003-11-03 23:46:40 UTC
"Anthony Briggs" <abriggs at westnet.com.au> schrieb im Newsbeitrag
However, it works when i try the smallest examples and use
sys.setrecursionlimit(4000)
This seems like a limitation in the pickling code. Yes?
I would suspect that you have a loop in your definitions, eg. A
imports B, and B imports A, particularly since you're trying small
examples, and they're still exceeding the recursion depth.
Unlikely, since it works if he sets sys.setrecursionlimit(4000).
As others put it:
"our new model of Cray is fast enough to finalize an infinite loop in a few
nanosecs"
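For what it's worth, a quick sanity check (a minimal sketch, in Python 2
syntax to match the rest of the thread): a plain reference loop pickles
fine, because pickle's memo handles cycles; it is the depth of the nesting
that eats stack frames.

import pickle

# Two lists that refer to each other: a genuine "loop" in the data.
a = []
b = [a]
a.append(b)

# This pickles without complaint -- pickle memoizes objects it has
# already seen instead of recursing into them again.
print len(pickle.dumps(a))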

Ciao Ulrich
Simon Burton
2003-11-04 01:34:40 UTC
Here is a simple example that breaks pickle.
(With N=256 all is well.)
This probably should go in the docs for pickle -
that highly interlinked data cannot be pickled
without raising the recursion limit.
I know that in the past we have been told not to
do this, because of GC issues, so I understand
if it's considered too pathological for now.

Simon.

#!/usr/bin/env python

#import cPickle as pickle
import pickle
import sys
#sys.setrecursionlimit(4000)

N = 512
print "building..."
# N lists, each of which ends up holding a reference to every list
# (including itself), i.e. a maximally interlinked structure.
nest = [ [] for i in range(N) ]
for i in range(N):
    for j in range(N):
        nest[i].append( nest[j] )

print "dumping..."
file = open("nest.pkl", "wb")
try:
    pickle.dump( nest, file )
except RuntimeError, e:
    print e
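One possible workaround (a rough sketch only, not something suggested in the
thread): number the nodes and pickle the links as integer indices, so the
dumped object is never more than two levels deep and the recursion limit is
never approached.

import pickle

N = 512
nest = [ [] for i in range(N) ]
for i in range(N):
    for j in range(N):
        nest[i].append( nest[j] )

# Number the nodes, then record each link as the index of its target.
# The pickled object is a list of lists of ints, only two levels deep.
index = {}
for i in range(N):
    index[id(nest[i])] = i
flat = [ [ index[id(child)] for child in node ] for node in nest ]
pickle.dump(flat, open("nest_flat.pkl", "wb"))

# Rebuilding the interlinked structure after loading:
flat = pickle.load(open("nest_flat.pkl", "rb"))
nest2 = [ [] for i in range(len(flat)) ]
for i in range(len(flat)):
    for j in flat[i]:
        nest2[i].append(nest2[j])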
Simon Burton
2003-11-03 00:29:14 UTC
Hi,

I am pickling big graphs of data and running into this problem:

[...]
File "/usr/lib/python2.2/pickle.py", line 225, in save
f(self, object)
File "/usr/lib/python2.2/pickle.py", line 414, in save_list
save(element)
File "/usr/lib/python2.2/pickle.py", line 219, in save
self.save_reduce(callable, arg_tup, state)
File "/usr/lib/python2.2/pickle.py", line 249, in save_reduce
save(state)
File "/usr/lib/python2.2/pickle.py", line 225, in save
f(self, object)
File "/usr/lib/python2.2/pickle.py", line 447, in save_dict
save(value)
File "/usr/lib/python2.2/pickle.py", line 219, in save
self.save_reduce(callable, arg_tup, state)
File "/usr/lib/python2.2/pickle.py", line 245, in save_reduce
save(arg_tup)
File "/usr/lib/python2.2/pickle.py", line 225, in save
f(self, object)
File "/usr/lib/python2.2/pickle.py", line 374, in save_tuple
save(element)
File "/usr/lib/python2.2/pickle.py", line 225, in save
f(self, object)
File "/usr/lib/python2.2/pickle.py", line 405, in save_list
write(self.put(memo_len))
RuntimeError: maximum recursion depth exceeded

However, it works when I try the smallest examples and use
sys.setrecursionlimit(4000).

This seems like a limitation in the pickling code. Yes?
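For the record, a minimal sketch of that workaround (4000 is just the value
mentioned above; CPython's default limit is 1000, and the pure-Python pickler
uses a couple of stack frames per level of nesting):

import sys
import pickle

# A structure nested 700 levels deep is too much for the Python 2.2
# pickler at the default limit, but fits once the limit is raised.
deep = []
for i in range(700):
    deep = [ deep ]

sys.setrecursionlimit(4000)
pickle.dump(deep, open("deep.pkl", "wb"))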

The data is perhaps better off in some kind of DB designed for
massively interconnected objects. Any suggestions? ZODB?
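For anyone curious, a rough sketch of what the ZODB route could look like
(written against a much later ZODB release, so the import paths and the
transaction API here are assumptions, not the 2003-era API):

import ZODB, ZODB.FileStorage
import transaction
from persistent.list import PersistentList

storage = ZODB.FileStorage.FileStorage("graph.fs")
db = ZODB.DB(storage)
conn = db.open()
root = conn.root()

# Make every node its own persistent object. References between
# persistent objects are stored as object ids, so no single record
# has to pickle the whole interlinked graph in one deep recursion.
N = 512
nodes = [ PersistentList() for i in range(N) ]
for i in range(N):
    for j in range(N):
        nodes[i].append(nodes[j])
root["nest"] = PersistentList(nodes)

transaction.commit()
db.close()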

BTW, the data is path-searching info for a game, and takes 1-2 MB of memory.

Thank you,

Simon Burton.
Anthony Briggs
2003-11-03 10:15:02 UTC
Post by Simon Burton
Hi,
[...]
File "/usr/lib/python2.2/pickle.py", line 225, in save
f(self, object)
File "/usr/lib/python2.2/pickle.py", line 414, in save_list
save(element)
...
Post by Simon Burton
File "/usr/lib/python2.2/pickle.py", line 225, in save
f(self, object)
File "/usr/lib/python2.2/pickle.py", line 405, in save_list
write(self.put(memo_len))
RuntimeError: maximum recursion depth exceeded
However, it works when I try the smallest examples and use
sys.setrecursionlimit(4000).
This seems like a limitation in the pickling code. Yes?
I would suspect that you have a loop in your definitions, e.g. A
imports B, and B imports A, particularly since you're trying small
examples, and they're still exceeding the recursion depth.

Hope that helps,

Anthony
--
----------------------------------------------------
HyPEraCtiVE? HeY, WhO aRE YoU cALliNg HypERaCtIve?!
aBRiGgS at wEStNeT.cOm.aU
----------------------------------------------------