python - Read a single byte from a large file
I have a function that reads a single byte from each of a number of large files. The problem is that after a certain number of file reads, memory use on the PC climbs steadily from 1.5 GB to 4 GB and higher, depending on how many files are read. (I break out at 80 files because anything higher crashes the PC.)
All I want is 1 byte, not the whole file. Please help.
import os
import sys
import scandir

def mem_test():
    count = 0
    for dirpath, dnames, fnames in scandir.walk(restorepaths[0]):
        for f in fnames:
            if 'dir.edb' in f or 'priv.edb' in f:
                f = open(os.path.join(dirpath, f), 'rb')
                f.read(1)  # problem line!
                f.close()
                if count > 80:
                    print 'exit'
                    sys.exit(1)
                count += 1

mem_test()
To address the memory issue, I think you'd want to use a generator:
def mem_test():
    for dirpath, dnames, fnames in scandir.walk(restorepaths[0]):
        for f in fnames:
            if 'dir.edb' in f or 'priv.edb' in f:
                with open(os.path.join(dirpath, f), 'rb') as f:
                    yield f.read(1)  # problem line!

results = [x for x in mem_test()]
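As a minimal sketch of the same idea on a single file (the function name first_byte and its path argument are illustrative, not part of the question), the with block guarantees the handle is closed even if read() raises, so open handles never accumulate across many files:

    def first_byte(path):
        # Open in binary mode, read exactly one byte, and let the
        # 'with' block close the handle even on error.
        with open(path, 'rb') as fh:
            return fh.read(1)

The generator version is also lazy: results only ever holds the single bytes that were yielded, so nothing larger than 1 byte per file stays in Python's own memory.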