How can I mitigate excess memory usage in browsers when asynchronously loading data?

I am supporting an HTML/JS browser-based app that asynchronously loads data (SVGs) as the user browses around in the app. (The app is an ebook reader, but the use-case is similar to, for example, an online mapping application.)

It is theoretically possible for the user to request more data than they have RAM available if they spend a long time using the app, as the total amount of downloadable data is large (GBs overall, but only tens of KB per request, i.e. per SVG).

I have had reports of some users experiencing slowdowns, browser hangs, etc. after extended periods of usage. There is no consistency across browsers or OSes.

This leads to a few related questions:

  1. Are there any "good practices" with this kind of application, to somehow remove older or less-often-used data from memory, within one session? How exactly (in JS) would this be done? Is it sufficient to remove an element from the DOM that contains an SVG, for the memory used by that SVG to be released? Is this even necessary?

  2. What exactly happens to the main browsers (Chrome, FF, IE8/9/10...) when the amount of data asynchronously requested exceeds the memory available? Is it just a case of hard-disk paging?

  3. Are there any tests that can be done within Javascript to know when "too much data" has been reached? e.g. my development rig has a large amount of RAM and so I do not notice this problem, yet on test rigs (and some user machines) there is much less RAM and the problem is found sooner (but not every time, and not easily repeatable, irritatingly).


JavaScript memory problems often have little to do with async data and much more to do with missing cleanup.

Even though JavaScript is garbage collected, you can still create memory leaks. Memory is divided between the DOM and the JavaScript heap. If anything on one 'side' references the other, neither can be GC'd until every cyclic reference between the two is broken.

A practical example of this: if an event listener is attached to a DOM element, the DOM now has a reference to the JS callback. If that same callback also holds a reference to the same DOM element in a variable, the cycle will prevent the related DOM tree AND any memory/closures captured by the JavaScript callback from being GC'd. In this scenario, the page will leak even if the DOM element is removed from the tree and you keep no references to the JavaScript callback!
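The cycle above can be sketched as follows. To keep the example self-contained, a plain object stands in for the DOM element; in a real page this would be an actual node, and the cleanup step would use `removeEventListener` or, as here, clearing the handler property before discarding the element:

```javascript
// A plain object stands in for a DOM element so this runs anywhere;
// in the browser it would be a real node (e.g. from querySelector).
function attachHandler(element) {
  // The closure captures `element`, and `element` holds the closure:
  // a DOM<->JS cycle that older engines (notably old IE) could not collect.
  element.onclick = function () {
    console.log("clicked", element.id);
  };
}

function cleanup(element) {
  // Break the cycle explicitly before discarding the element,
  // so both the node and the closure become collectable.
  element.onclick = null;
}

const el = { id: "svg-page-42", onclick: null };
attachHandler(el);
// ...later, when the element is no longer needed:
cleanup(el);
```

The same principle applies to listeners added with `addEventListener`: remove them with `removeEventListener` (passing the same function reference) before dropping the element.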

One of the reasons jQuery uses the $() function is to wrap DOM object references, keeping you from holding direct references to DOM objects in your code. That does not mean you can't create leaks if you aren't careful: precisely because of the wrapping, you can leak if you don't always go through the wrapper to do DOM manipulation and cleanup.
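A toy model of why bypassing the wrapper leaks (this is not jQuery's actual internals, just the shape of the problem: the wrapper keeps a side cache of handlers/data keyed by element, and only the wrapper's own removal path clears it):

```javascript
// Hypothetical mini-wrapper: like jQuery, it records event handlers
// in a side cache rather than on the element itself.
const sideCache = new Map();

function wrap(element) {
  return {
    on(eventName, handler) {
      if (!sideCache.has(element)) sideCache.set(element, []);
      sideCache.get(element).push({ eventName, handler });
    },
    remove() {
      // The wrapper's removal path clears its cache entry...
      sideCache.delete(element);
      // ...and detaches the node.
      if (element.parentNode) {
        element.parentNode.removeChild(element);
      }
    },
  };
}
```

If you remove the node with the native API instead of `remove()`, the side-cache entry (and every handler closure it holds) stays alive forever, even though the node is gone from the tree.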

With all the different ways you can create leaks, it's best to use a tool to diagnose whether memory leaks are actually your problem. Chrome DevTools lets you profile your application and see how memory behaves over time; heap snapshots can even show you which references aren't being cleaned up after a GC.

If you find there aren't memory leaks but your application's memory footprint is still too big, there are a few techniques you can use: lazy-load content, reuse existing DOM by cloning a template, use a single delegated event handler to catch events as they bubble instead of attaching handlers to every child, and evict data you've already shown once it goes stale.
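For capping the footprint of already-downloaded SVGs (the asker's question 1), a small least-recently-used cache is one option. This is a sketch, not a drop-in solution; the entry budget and keying by URL are assumptions. Note that evicting an entry only frees memory once nothing else (e.g. a detached DOM node) still references the SVG markup:

```javascript
// Minimal LRU cache: a Map iterates in insertion order, so the first
// key is always the least recently used entry.
class SvgCache {
  constructor(maxEntries) {
    this.maxEntries = maxEntries;
    this.entries = new Map();
  }

  get(url) {
    if (!this.entries.has(url)) return undefined;
    const svg = this.entries.get(url);
    // Re-insert to mark this entry as most recently used.
    this.entries.delete(url);
    this.entries.set(url, svg);
    return svg;
  }

  set(url, svg) {
    this.entries.delete(url);
    this.entries.set(url, svg);
    if (this.entries.size > this.maxEntries) {
      // Evict the least recently used entry; once no other reference
      // to the markup exists, the GC can reclaim it.
      const oldest = this.entries.keys().next().value;
      this.entries.delete(oldest);
    }
  }
}
```

On eviction you would also remove the corresponding element from the DOM (and detach its handlers, as above); removing the element alone is only sufficient if nothing in JS still references it.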
