[Python] np.fromfile memory limit

2021-07-10 19:00:29 · Views: 277 · Source: Internet



https://stackoverflow.com/questions/54545228/is-there-a-memory-limit-on-np-fromfile-method

The problem was caused by having installed 32-bit Python.

Solution: install 64-bit Python.

Is there a memory limit on np.fromfile() method?


I am trying to read a big file into an array with the help of np.fromfile(); however, after a certain number of bytes it gives a MemoryError.

import numpy as np

with open(filename, 'rb') as file:  # 'rb': np.fromfile needs binary mode
    data = np.fromfile(file, dtype=np.uint16, count=2048*2048*63)
    data = data.reshape(63, 2048, 2048)

It works fine with 2048*2048*63 but not with 2048*2048*64. How can I debug this? I am wondering what the bottleneck is here.

Edit: I am running on Windows 10, RAM 256GB, it is a standalone script, 64bit Python.

Edit2: I followed the advice in the comments; now I get the error at 128*2048*2048, and it works fine with 127*2048*2048.
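One way to narrow down where the failure occurs is to grow the read size one slab at a time until the allocation fails. A minimal sketch (the helper name and `filename` argument are hypothetical, assuming the same raw uint16 layout as above):

```python
import numpy as np

def find_memory_limit(filename, slab=2048 * 2048, max_slabs=128):
    """Return the largest slab count that np.fromfile can still allocate."""
    with open(filename, 'rb') as f:
        for n in range(1, max_slabs + 1):
            f.seek(0)  # restart the read for each attempt
            try:
                data = np.fromfile(f, dtype=np.uint16, count=slab * n)
            except MemoryError:
                return n - 1  # the previous count was the last one that fit
            del data  # release the array before the next, larger attempt
    return max_slabs
```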

asked Feb 6 '19 at 0:50, edited Feb 6 '19 at 1:07, by CanCode
  •   How much RAM do you have and what is the bittage of your system? – Mad Physicist Feb 6 '19 at 0:52
  • 1 How much RAM do you have? How large is your swap/pagefile? Is this standalone code, or part of a larger program with other large allocations? Are you running a 32 or 64 bit version of Python? We need a lot more details to provide useful answers. – ShadowRanger Feb 6 '19 at 0:54
  •   256GB RAM, but python cannot handle that? – CanCode Feb 6 '19 at 0:54
  • 2 2048 * 2048 * 2 * 64 = 0.5 GiB. Shouldn't be an issue. Unless you do it a few thousand times – Mad Physicist Feb 6 '19 at 0:55 
  • 1 For the record, on 64 bit CPython 3.7.1 running on Ubuntu bash on Windows, I can't reproduce (and I have far less RAM than you, only 12 GB). It loads and reshapes just fine (if I change it to data.reshape from np.reshape). While it's unlikely to matter, it would be useful to know what OS and specific Python version you're running. – ShadowRanger Feb 6 '19 at 0:59 

1 Answer


Despite what you believe, you've installed a 32 bit version of Python on your 64 bit operating system, which means user-mode virtual address space is limited to only 2 GB, and attempts to allocate contiguous blocks of a GB or more can easily fail due to address space fragmentation.

The giveaway is your sys.maxsize, which is just the largest value representable by a C ssize_t in your build of Python. 2147483647 corresponds to 2**31 - 1, which is the expected value on 32 bit Python. A 64 bit build would report 9223372036854775807 (2**63 - 1).
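A quick way to check this from inside the interpreter (a sketch; `struct.calcsize("P")` is simply the size of a C pointer in the running build):

```python
import struct
import sys

# On a 32-bit build, sys.maxsize is 2**31 - 1; on 64-bit it is 2**63 - 1.
print(sys.maxsize)
print(struct.calcsize("P") * 8)  # pointer width in bits: 32 or 64
print(sys.maxsize > 2**32)       # True only on a 64-bit build
```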

Uninstall the 32 bit version of Python and download/install a 64 bit version (link is to the 3.7.2 download page) (look for the installer to be labelled as x86-64, not x86; the file name will include amd64). Annoyingly, the main page for downloading Python defaults to offering the 32 bit version for Windows, so you have to scroll down to the links to the specific version download pages, click on the latest, then scroll down to the complete list by OS and bitness and choose appropriately.
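As an aside, if you only ever need a slice of the file at a time, np.memmap leaves the data on disk and pages it in on demand, so you never materialize one huge contiguous copy in RAM. A minimal sketch (the helper name and default shape are hypothetical, assuming the raw uint16 layout from the question):

```python
import numpy as np

def load_frame(filename, index, shape=(63, 2048, 2048)):
    """Map the raw file read-only and copy a single frame into memory."""
    data = np.memmap(filename, dtype=np.uint16, mode='r', shape=shape)
    return np.array(data[index])  # materialize just this one frame
```

Note that the mapping itself still consumes virtual address space equal to the file's mapped region, so this is a convenience on a 64-bit build rather than a workaround for the 32-bit limit.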

Source: https://www.cnblogs.com/focus-z/p/14994539.html
