
Python memory error



What exactly is a Memory Error?

A Python MemoryError means, in layman's terms, that your program has run out of random-access memory (RAM). The error indicates that you have tried to load all of your data into memory at once. For large datasets, batch processing is advised: instead of loading the complete dataset into memory, keep it on your hard disk and read it in batches.

Your software has run out of memory, resulting in a memory error. It indicates that your program creates an excessive number of objects. You'll need to check for the parts of your algorithm that consume a lot of memory.

A memory error occurs when an operation runs out of memory.

Python has a fallback exception, as do all programming languages, for when the interpreter runs out of memory and must abandon the current execution. In these (hopefully infrequent) cases, Python raises a MemoryError, giving the script a chance to catch it and break free from the memory shortage. However, because Python's memory management architecture is based on the C language's malloc() function, it is not guaranteed that every process will recover; in some situations, a MemoryError will result in an unrecoverable crash.

A MemoryError usually signals a severe fault in the application. For example, a program that accepts files or user data as input may encounter MemoryErrors if it lacks proper sanity checks. Memory restrictions can cause problems in various situations, but for our code example we'll stick with a simple allocation in local memory using strings and arrays.
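As a minimal sketch (assuming a 64-bit build, and using a hypothetical try_allocate helper), a deliberately absurd allocation request fails fast and can be caught as a MemoryError:

```python
def try_allocate(n_bytes):
    """Attempt one contiguous allocation; report failure instead of crashing."""
    try:
        buf = bytearray(n_bytes)
        return len(buf)
    except MemoryError:
        return None

# Roughly 4 exbibytes: far beyond any real address space, so the allocator
# refuses the request immediately without actually consuming memory.
print(try_allocate(2 ** 62))  # None on any realistic 64-bit machine
print(try_allocate(1024))     # 1024
```

Because the request is rejected up front, this sketch is safe to run; a request that the operating system initially grants but cannot back with real memory can still crash the process later.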


The computer architecture on which the executing system runs is the most crucial factor in whether your applications are likely to incur MemoryErrors, or, to be more specific, the architecture of the Python build you're using. If you're running a 32-bit Python, the maximum memory allocation granted to the process is meager: the exact limit depends on your system, but it is generally around 2 GB and never exceeds 4 GB.

64-bit Python versions, on the other hand, are essentially restricted only by the amount of memory available on your system. Thus, in practice, a 64-bit Python interpreter is unlikely to have memory problems, and if it does, the pain is much more severe because it would most likely affect the rest of the system.
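To check which kind of build you are running, the standard library can report the interpreter's pointer size:

```python
import struct
import sys

# The size of a C pointer reveals whether this interpreter is 32- or 64-bit.
bits = struct.calcsize("P") * 8
print(f"{bits}-bit Python")

# sys.maxsize is 2**31 - 1 on 32-bit builds and 2**63 - 1 on 64-bit builds.
print("sys.maxsize =", sys.maxsize)
```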

To verify this, we can use the third-party psutil package to get information about the running process, specifically the psutil.virtual_memory() function, which returns current memory consumption statistics when called. A print_memory_usage() helper then prints that information:
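The original helper is not reproduced in this article, so below is a sketch of what such a print_memory_usage() function might look like; it assumes the third-party psutil package is installed and degrades gracefully if it is not:

```python
try:
    import psutil  # third-party: pip install psutil
except ImportError:
    psutil = None

def print_memory_usage():
    """Print system memory statistics via psutil.virtual_memory(), if available."""
    if psutil is None:
        print("psutil is not installed")
        return None
    vm = psutil.virtual_memory()
    print(f"total:     {vm.total / 2**30:.2f} GiB")
    print(f"available: {vm.available / 2**30:.2f} GiB")
    print(f"used:      {vm.percent}%")
    return vm

print_memory_usage()
```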

Several Types of Python Memory Errors

Unexpected Memory Error in Python

Even if you have plenty of RAM, you can get an unexpected Python MemoryError if you are using a 32-bit Python installation.

Unexpected Python Memory Error: A Simple Solution

Your software has used up all of the virtual address space available to it. This is most likely because you're using a 32-bit Python version: 32-bit applications are limited to 2 GB of user-mode address space on Windows (and most other operating systems).


We Python Poolers recommend installing a 64-bit version of Python (and, if possible, updating to Python 3 for various other reasons); it will use slightly more memory, but it will also have far more address space available (and can make use of more physical RAM as well).

The problem is that 32-bit Python can address at most 4 GB of RAM, and the usable amount can shrink even further due to operating system overhead, especially if your operating system is itself 32-bit.

For example, the zip() function in Python 2 accepts several iterables and produces a single list of tuples. When looping, however, we only need each item once, so we don't need to keep all of the tuples in memory at the same time. It is therefore preferable to use izip(), which produces each item only as it is requested. In Python 3, zip() behaves like izip() by default.
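A quick illustration of this laziness in Python 3:

```python
a = range(10 ** 6)
b = range(10 ** 6)

# In Python 3, zip() is lazy: it yields one tuple per iteration instead of
# building a million-element list up front (as Python 2's zip() did).
pairs = zip(a, b)
print(next(pairs))  # (0, 0)

# Materializing everything at once is what costs memory:
# all_pairs = list(zip(a, b))  # ~1,000,000 tuples allocated up front
```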

Memory Error in Python Because of the Dataset

Another possibility, if you're working with a huge dataset, is the dataset size itself; the 32-bit versus 64-bit issue has already been mentioned. Loading a vast dataset into memory, running computations on it, and preserving intermediate results of those calculations can quickly consume memory. If this is the case, generator functions can be quite helpful. Many major Python libraries, such as Keras and TensorFlow, include dedicated generator methods and classes.
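As a sketch of the idea (read_in_batches is a hypothetical helper, not part of any library), a generator can yield a dataset one batch at a time, so only one batch is ever in memory:

```python
def read_in_batches(path, batch_size=1000):
    """Yield lists of up to batch_size lines from a file, one batch at a time."""
    batch = []
    with open(path) as fh:
        for line in fh:
            batch.append(line.rstrip("\n"))
            if len(batch) == batch_size:
                yield batch
                batch = []
    if batch:  # final partial batch
        yield batch
```

Consuming the generator in a for loop processes the whole file without ever holding more than one batch of lines.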

Memory Error in Python Because Python Was Installed Incorrectly

An improper Python package installation can also result in a MemoryError. Before resolving the issue, we had manually installed Python 2.7 and the packages we needed on Windows. After spending nearly two days attempting to figure out what was wrong, we reinstalled everything using Conda, and the issue was resolved.


Conda probably installs better memory management packages, which is the main reason. So you might try installing your Python packages with Conda to see if that fixes the MemoryError.

Conda is a free and open-source package management and environment management system for Windows, Mac OS X, and Linux. Conda is a package manager that installs, runs, and updates packages and their dependencies in a matter of seconds.

Python Out of Memory Error

When an attempt to allocate a block of memory fails, most systems return an “Out of Memory” error, but the core cause of the problem rarely has anything to do with being “out of memory.” That’s because the memory manager on almost every modern operating system will gladly use your available hard disk space for storing memory pages that don’t fit in RAM. In addition, your computer can usually allocate memory until the disk fills up, which may result in a Python Out of Memory Error (or a swap limit is reached; in Windows, see System Properties > Performance Options > Advanced > Virtual memory).

To make matters worse, every current allocation in the program’s address space can result in “fragmentation,” which prevents further allocations by dividing available memory into chunks that are individually too small to satisfy a new allocation with a single contiguous block.

  • When operating on a 64-bit version of Windows, a 32-bit application with the LARGEADDRESSAWARE flag set has access to the entire 4 GB of address space.
  • Four readers have written in to say that the gcAllowVeryLargeObjects setting removes the .NET restriction. It does not: this setting permits objects to take up more than 2 GB of memory, but it still limits the number of elements in a single-dimensional array to 2³¹ entries.

In Python, how do I manually free memory?

If you’ve written a Python program that uses a large input file to generate a few million objects, and it’s eating up a lot of memory, what’s the best approach to tell Python that some of the data is no longer needed and may be freed?


This problem has a simple solution:

You can make the garbage collector release unreferenced memory by calling gc.collect(), as illustrated in the example below:

import gc
gc.collect()
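For a slightly fuller picture of what gc.collect() does, consider a reference cycle, which reference counting alone cannot reclaim:

```python
import gc

class Node:
    def __init__(self):
        self.partner = None

# Two objects referring to each other form a reference cycle; once the
# outside names are deleted, only the cycle detector can reclaim them.
a, b = Node(), Node()
a.partner, b.partner = b, a
del a, b

freed = gc.collect()  # returns the number of unreachable objects found
print("unreachable objects collected:", freed)
```

Note that most objects are freed immediately by reference counting when their last reference disappears (for example via del); gc.collect() is mainly useful for cycles like the one above.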

Do you get a memory error even though there is more than 50 GB of free memory, and you're using 64-bit Python?
On some operating systems, there are limits on how much memory a single process can use. So, even if there is adequate RAM available, your single process may not be able to claim any more of it. However, we are not certain that this applies to your Windows version.

How can you make python scripts use less memory?

Python uses garbage collection and built-in memory management so that the application consumes only as much memory as needed. So, unless you explicitly construct your program to balloon memory utilization, such as building an in-memory database, Python only uses what it requires.

This raises the question of why you would want to consume more RAM in the first place. For most programmers, the goal is to use as few resources as possible.


If you wish to keep Python's memory use and virtual memory footprint to a minimum, try this:

  • On Linux, use the ulimit command to set a memory limit for Python.
  • You can use the resource module to limit how much memory the program uses.
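A sketch of the ulimit approach on Linux (the second command simply echoes the limit the subshell now sees):

```shell
# ulimit -v caps the virtual memory of the current shell, in kilobytes.
# Run it in a subshell so the cap does not stick to your login shell.
( ulimit -v 1048576; ulimit -v )
```

In real use, replace the second `ulimit -v` with the command that launches your program, e.g. `python your_script.py` (a placeholder name); allocations beyond the cap will then fail with MemoryError.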

Consider the following if you wish to speed up your software by giving it more memory: multiprocessing or threading.
On Python 2.5 and earlier only, you can use Psyco.

How can I set memory and CPU usage limits?

Sometimes we want to limit the amount of memory or CPU an application uses while it is running, so that we don't run into memory problems. The resource module can accomplish both tasks, as demonstrated in the code below:

Code 1: Limit CPU usage

# libraries being imported
import signal
import resource

# handler fired when the CPU-time limit is exceeded
def exceeded_time(sig_number, frame):
    print("Time is finally up!")
    raise SystemExit(1)

# resource limit setup
def maximum_runtime(count_seconds):
    if_soft, if_hard = resource.getrlimit(resource.RLIMIT_CPU)
    resource.setrlimit(resource.RLIMIT_CPU, (count_seconds, if_hard))
    signal.signal(signal.SIGXCPU, exceeded_time)

# set a maximum CPU time of 25 seconds, then busy-loop until it triggers
if __name__ == '__main__':
    maximum_runtime(25)
    while True:
        pass

Code #2: To minimize memory usage, the code restricts the total address space available.


# using resource
import resource

def limit_memory(maxsize):
    if_soft, if_hard = resource.getrlimit(resource.RLIMIT_AS)
    resource.setrlimit(resource.RLIMIT_AS, (maxsize, if_hard))

How to Deal with Python Memory Errors and Big Data Files

Increase the amount of memory available

A default memory configuration may limit some Python tools or modules.

Check whether your tool or library can be re-configured to allocate more RAM.

Some tools are platforms built to handle massive datasets: they allow data transformations, and machine learning algorithms can be applied on top of them. Weka is a fantastic example of this, as you can increase the memory as a parameter when running the application.

Use a Smaller Sample Size

Are you sure you require all of the data?

Take a smaller sample of your data, such as the first 5,000 or 100,000 rows, and use it to work through your problem instead of all of your data; only fit the final model on the full dataset (using progressive data loading techniques).
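A minimal sketch of both sampling strategies, using a stand-in list as the dataset:

```python
import random

# Stand-in dataset: imagine each element is one row of your real data.
rows = list(range(1_000_000))

first_rows = rows[:5_000]                  # the "first 5,000 rows" approach
random_rows = random.sample(rows, 5_000)   # an unbiased random sample

print(len(first_rows), len(random_rows))  # 5000 5000
```

A random sample avoids the bias that the first N rows can carry (for example, data sorted by date), at the cost of one pass over the data.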


This is an excellent practice for machine learning in general, as it allows for quick spot-checks of algorithms and fast turnaround of results.

You may also run a sensitivity analysis, comparing the amount of data used to fit the algorithm against the resulting model skill. Perhaps a natural point of diminishing returns can serve as a guideline for the size of your smaller sample.

Make use of a computer that has more memory

Is it necessary for you to use your own computer? It is often possible to get your hands on a considerably larger machine with significantly more memory. A good example is renting computing time from a cloud provider such as Amazon Web Services, which offers machines with tens of gigabytes of RAM for less than a dollar per hour.

Make use of a relational database

Relational databases are a standard method of storing and retrieving massive datasets.

Internally, data is saved on a disk, loaded in batches, and searched using a standard query language (SQL).


Most (all?) programming languages and many machine learning tools can connect directly to relational databases, including free open-source solutions like MySQL or Postgres. You can also use SQLite, a lightweight file-based database.
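As a sketch of batched retrieval (the table and column names are made up for the example), SQLite's fetchmany() keeps only one batch of rows in Python memory at a time:

```python
import sqlite3

# An in-memory SQLite database stands in for a real one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (id INTEGER, value REAL)")
conn.executemany("INSERT INTO samples VALUES (?, ?)",
                 ((i, i * 0.5) for i in range(10_000)))

# fetchmany() pulls rows in fixed-size batches, so the full result set
# never has to sit in Python memory at once.
cur = conn.execute("SELECT id, value FROM samples")
total = 0
while True:
    batch = cur.fetchmany(1000)
    if not batch:
        break
    total += len(batch)  # replace with your per-batch processing
print("rows processed:", total)  # rows processed: 10000
```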

Use a big data platform to your advantage

In some cases, you may need to use a big data platform.

Error Monitoring Software

Airbrake’s powerful error monitoring software delivers real-time error monitoring and automatic exception reporting for all of your development projects. Airbrake’s cutting-edge web dashboard keeps you up to date on your application’s health and error rates at all times. Airbrake effortlessly integrates with all of the most common languages and frameworks, no matter what you’re working on. Furthermore, Airbrake makes it simple to adjust exception parameters while giving you complete control over the active error filter system, ensuring that only the most critical errors are collected.

Check out Airbrake’s error monitoring software for yourself and see why so many of the world’s greatest engineering teams rely on it to transform their exception handling techniques!



In this article, you learned about various strategies and methods for coping with Python Memory Error.

Would you mind letting us know in the comments section if you have used any of these methods?

