How to detect if loading an image will throw an OutOfMemory exception in .NET?

I have an application written using .NET 3.5 SP1 that downloads images from an external site and displays them to end users. On rare occasions, my users are experiencing OutOfMemory errors because they are downloading enormous images. Sometimes the raw data for these images is large, but more often, the dimensions of the images are huge. I realize that I may never be able to get around the fact that these OOM errors are thrown for particular images. It would be VERY helpful, however, if I could somehow determine whether loading a particular image will lead to an OOM issue before I try to load it.

The data for the images is loaded into a Stream, and the image itself is then created by passing that stream to System.Drawing.Image.FromStream(stream). I do not have the option of storing these images on disk first; they must be processed entirely in memory.

If anyone has any tips or suggestions that would allow me to detect that loading an image would lead to an OOM exception, I would very much appreciate it.

Answers


You can use the MemoryFailPoint class to check for memory availability before attempting the load.
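A minimal sketch of how that could look. MemoryFailPoint takes an estimate in megabytes and throws InsufficientMemoryException (a different, recoverable exception) if the CLR does not expect the allocation to succeed; the 64 MB default below is an assumption you would replace with your own estimate:

```csharp
using System;
using System.Drawing;
using System.IO;
using System.Runtime;

static Image LoadImageGuarded(Stream stream, int estimatedMegabytes)
{
    try
    {
        // MemoryFailPoint throws InsufficientMemoryException (not OOM)
        // up front if the CLR cannot expect to satisfy the allocation.
        using (new MemoryFailPoint(estimatedMegabytes))
        {
            return Image.FromStream(stream);
        }
    }
    catch (InsufficientMemoryException)
    {
        // The gate failed before any allocation was attempted,
        // so the process is still in a healthy state.
        return null;
    }
}
```

One caveat: you need a reasonable size estimate before calling it, which ties into the header-reading suggestions below.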

Best


OutOfMemory is one of those exceptions where you don't have a lot of good options. Anything that would predict conclusively that you're going to get the exception would probably have to just generate the exception.

I would say that your best bet is to profile the behaviour and create your own predictive ruleset or simply hard code maximum sizes into your application. It ain't pretty, but it'll get you there.


You might look at this question and see if it helps: How do I reliably get an image's dimensions in .NET without loading the image?

The idea there would be to download only part of the total image (specifically, the header) so you can read the metadata. Then you can use that information to determine how big the image is, and abort the download if you see it will be too big.

On the downside, it seems like you'd have to write a method to decompose the binary of each file type you want to be able to handle.
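As a sketch of what one of those per-format parsers might look like, here is PNG only: the format stores width and height as big-endian 32-bit integers at byte offsets 16 and 20, right after the 8-byte signature and the IHDR chunk header. JPEG, GIF, BMP, etc. would each need their own parser:

```csharp
using System;

// Parses width/height from the first 24 bytes of a PNG file.
static bool TryGetPngSize(byte[] header, out int width, out int height)
{
    width = height = 0;
    if (header.Length < 24) return false;

    // PNG signature: 137 80 78 71 13 10 26 10
    byte[] sig = { 137, 80, 78, 71, 13, 10, 26, 10 };
    for (int i = 0; i < 8; i++)
        if (header[i] != sig[i]) return false;

    // Big-endian 32-bit width and height inside the IHDR chunk.
    width  = (header[16] << 24) | (header[17] << 16) | (header[18] << 8) | header[19];
    height = (header[20] << 24) | (header[21] << 16) | (header[22] << 8) | header[23];
    return true;
}
```

With the dimensions in hand you can apply a size heuristic (see the width * height * 4 estimate below in this thread) before the body of the image is ever downloaded.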


You've got a chicken-and-egg problem. To make some kind of guess, you need to know the size of the image. You don't know the size until you've loaded it.

It isn't really helpful anyway. Whether you get OOM really depends on how fragmented the virtual memory address space has gotten, and that is not easy to find out in Windows. It requires the HeapWalk() API function, and that's an unhealthy function to use. Check out the small print in the MSDN library article for it. It is especially bad in a managed program; don't use it.

Note that this OOM exception is not the same kind of OOM you'd get when you used up too much managed memory. It is actually a GDI+ exception and you can easily recover from it. Just catch the exception and display a "Sorry, couldn't do it" message.
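A sketch of that recovery, assuming the stream-based loading the question describes. Note that GDI+ also raises OutOfMemoryException for corrupt or unsupported data, not just genuine memory exhaustion:

```csharp
using System;
using System.Drawing;
using System.IO;

// Treats GDI+'s OutOfMemoryException as a recoverable decode failure.
static Image TryLoadImage(Stream stream)
{
    try
    {
        return Image.FromStream(stream);
    }
    catch (OutOfMemoryException)
    {
        // GDI+ throws this for images it cannot decode as well as
        // for genuinely huge ones; either way, show the "sorry" message.
        return null;
    }
    catch (ArgumentException)
    {
        // Thrown when the stream does not contain a valid image at all.
        return null;
    }
}
```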

If you do somehow know the size up front, then you can pretty safely assume that width * height * 4 > 550 MB is not going to work in a 32-bit program. This limit drops quickly after the program has been running for a while.
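That heuristic is a one-liner. The 550 MB figure is the rough practical ceiling for a fresh 32-bit process mentioned above, and the 4 bytes per pixel assumes a 32 bpp decode; treat both as assumptions, not guarantees:

```csharp
// Rough pre-check: estimated decoded bitmap size vs. a practical
// 32-bit limit. Fragmentation lowers the real ceiling over time.
const long MaxDecodedBytes = 550L * 1024 * 1024;

static bool LikelyFitsInMemory(int width, int height)
{
    return (long)width * height * 4 <= MaxDecodedBytes;
}
```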


If you're downloading the images from an external site, and the external site sets the Content-Length HTTP header, you might be able to estimate if the image is going to fit in memory before you even start downloading the stream...
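A sketch of that check using HttpWebRequest (available in .NET 3.5). A HEAD request exposes Content-Length without transferring the body; keep in mind the header gives the compressed file size, and the decoded bitmap can be far larger. The 10 MB cap here is an arbitrary example threshold:

```csharp
using System;
using System.Net;

const long MaxDownloadBytes = 10L * 1024 * 1024; // example threshold

// Pure policy check, separated out so it is easy to test.
static bool WithinLimit(long contentLength)
{
    // ContentLength is -1 when the server omits the header.
    return contentLength >= 0 && contentLength <= MaxDownloadBytes;
}

static bool IsSmallEnoughToDownload(string url)
{
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = "HEAD";
    using (var response = (HttpWebResponse)request.GetResponse())
    {
        return WithinLimit(response.ContentLength);
    }
}
```

If the server omits Content-Length (chunked responses, for instance), you fall back to the header-sniffing approach discussed above.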


I agree with @Vagaus's answer, but I wanted to add that you should allocate the buffer only once and try to reuse it. If you are constantly allocating and releasing a large buffer, you will eventually hit an OOM issue due to heap fragmentation.
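A sketch of that reuse pattern. Arrays of 85 KB or more land on the Large Object Heap, which is not compacted in .NET 3.5, so repeated allocate/release cycles fragment it; the 10 MB buffer size below is an assumption sized to the largest image you accept:

```csharp
using System.IO;

class ImageDownloader
{
    // Allocated once and reused for every download.
    private readonly byte[] buffer = new byte[10 * 1024 * 1024];

    // Copies a source stream into memory through the shared buffer
    // instead of allocating a fresh large array per request.
    public MemoryStream CopyToStream(Stream source)
    {
        var result = new MemoryStream();
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            result.Write(buffer, 0, read);
        result.Position = 0;
        return result;
    }
}
```

Note this makes the downloader single-threaded per instance; concurrent downloads would each need their own instance or a buffer pool.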

