Is there any limit on the number of HTML elements that a browser can display without problems?
Update: The delay becomes noticeable after about a thousand loaded rows. Scrolling itself is quite bearable, but, for example, highlighting the clicked row (via a single event handler on the tbody) is painful: it takes at least 2-3 seconds, and the delay grows with the number of rows. I observe the delay in all browsers. It's not only me; almost everyone who visits the page sees it, so I guess it affects every platform to some extent.
Update: I came up with a simple example here: http://client.infinity-8.me/table.php?num=1000 (you can pass whatever number you want as num). Basically, it renders a table with num rows and has a single event handler attached to the parent table. I have to conclude from this that there is actually no noticeable drop in performance caused by the number of child elements, so it's probably a leak somewhere else :(
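For reference, a minimal sketch of the delegated-handler pattern described above. The function name is my own invention; in the real page you'd pass `event.target` and the `tbody` element:

```javascript
// Hypothetical helper: walk up from the clicked element to the nearest
// <tr> inside the container. With one handler on <tbody>, nothing is
// attached per row; only this short walk runs on each click.
function rowFromEventTarget(target, container) {
  let node = target;
  while (node && node !== container) {
    if (node.tagName === 'TR') return node;
    node = node.parentNode;
  }
  return null; // click landed outside any row
}

// In the page it would be wired up roughly like this:
// tbody.addEventListener('click', function (event) {
//   const row = rowFromEventTarget(event.target, tbody);
//   if (row) row.classList.toggle('highlighted');
// });
```

If a click through this path still takes seconds, the cost is almost certainly in whatever the handler does afterwards (e.g. restyling every row), not in the number of elements itself.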
I don't think there is a limit defined by the standard. There might be a limit hard-coded in each browser implementation, though I would imagine that this limit is likely to be billions of elements. Another limit is the amount of addressable memory.
To solve your problem: As well as automatically loading elements as you scroll down, you could automatically unload the ones that have scrolled up off the screen. Then your program will remain fast even after scrolling a lot.
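The load/unload idea above is often called windowing or virtualization. A sketch of the underlying arithmetic, assuming a fixed row height (the function name and parameters are mine):

```javascript
// Hypothetical windowing math for a virtualized table: given the scroll
// position, compute which rows should actually exist in the DOM.
// `overscan` extra rows are kept above and below the viewport so fast
// scrolling doesn't flash empty space.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows, overscan) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const last = Math.min(
    totalRows - 1,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + overscan
  );
  return { first, last };
}
```

On each scroll event you would re-run this and add/remove rows so only `first..last` are in the document, keeping the node count constant no matter how large the dataset is.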
You may also want to consider an alternative interface such as paging.
Another thing you should look at is table sizing. If your table uses table-layout: auto (the default), the browser has to measure every single cell in the table to size the columns. This can get insanely slow.
Instead, specify explicit column widths and style the table with table-layout: fixed.
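A minimal example of that styling (the class name is illustrative):

```css
/* With table-layout: fixed, column widths come from the table width and
   the first row (or <col> elements), so the browser can lay the table
   out in one pass instead of measuring every cell. */
table.big-data {
  table-layout: fixed;
  width: 100%;
}
```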
If you attach a JavaScript handler to each table row, old computers will not handle that well. For the HTML itself you shouldn't worry much.
You should worry about the fact that normal human beings don't like large tables; that is what pagination is made for. Split the table into pages for better usability, among other concerns.
Think of a book that has no pages, just one long scroll: would you want to read it, even if your eyes (the PC, in our case) could handle it?
I don't think there is a limit. However, the longer an HTML file is, the more resources your computer will need. But the table would have to be very large for that to matter...
The limit is really determined by the user agent and the client machine being used. HTML, like XML, is a tree format: the more elements there are, the more of the tree the browser has to traverse to render the page.
I had issues adding more than 100 tables to a div (as an old workaround for IE6 not being able to create table elements dynamically).
The other thing is obviously your computer (RAM and CPU). But to be honest, most computers shouldn't have a problem with that unless we're talking 10,000+ rows... and even then...
Can you post some code?
Given that there is a multitude of browsers and rendering engines, all with different performance characteristics, and given that those engines are steadily improving while hardware keeps getting faster: no, there is no fixed upper limit on what a browser can handle. There are, however, practical upper limits for specific browser versions on specific hardware.
Never mind RAM or CPU usage; what's the actual size of the file after it preloads?
If your table is really that huge, you could be forcing your users to download megabytes of data. I've found that some machines tend to hang beyond 2-4 MB of data.
It depends. IE, for example, will not start rendering a table until all of its content has loaded. So if you have a 5,000-row table, it needs to load all 5,000 rows of data before rendering any of it, whereas other browsers start rendering once they have partial data and adjust a bit (if needed) as the table grows.
Generally speaking, rendering slows with the quantity of nodes, but also with their complexity. Avoid nested tables and, if at all possible, try to break huge tables into chunks, e.g. every 100 rows (if possible).
There really isn't a reason in the world to publish an entire huge dataset on a single page. If the requirement is to give the user all that data, you should export it to a file that can be read by software better suited to it than a browser.
Instead, I suggest building an AJAX-driven page where the user sees a portion of the data; when they need to see more, you download just that portion and replace the current dataset on the page. This is pagination. Google search is an excellent example of this.
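The paging arithmetic behind such an interface can be sketched like this (function names are mine; the server or the client could use the same logic to pick the slice to send or render):

```javascript
// Hypothetical paging helpers for a dataset rendered one page at a time.
function pageCount(totalRows, pageSize) {
  // At least one page, even for an empty dataset.
  return Math.max(1, Math.ceil(totalRows / pageSize));
}

function pageSlice(rows, page, pageSize) {
  // `page` is 1-based, matching the page numbers shown to the user.
  const start = (page - 1) * pageSize;
  return rows.slice(start, start + pageSize);
}
```

An AJAX request for page n would then fetch only `pageSize` rows, so the DOM never holds more than one page's worth of elements.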
If there are any limits, they depend on the browser. But your problem is not about a hard limit, since the browser still displays the page.
Big tables are always a problem for browsers. Rendering a large table takes a lot of time, so it is often a good idea to split a large table into smaller ones.
Further, you probably want to specify the column widths. Without them, the browser has to download the whole table before it can calculate the width of each column and render it. If you specify the widths in the HTML code, the browser can display the page while it is still downloading. (Note: specifying the width of the whole table is not enough; you need to specify the width of each column.)
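For example (the widths here are purely illustrative):

```html
<!-- Explicit per-column widths plus table-layout: fixed let the browser
     render rows as they arrive, without waiting for the whole table. -->
<table style="table-layout: fixed; width: 620px;">
  <colgroup>
    <col style="width: 120px;">
    <col style="width: 300px;">
    <col style="width: 200px;">
  </colgroup>
  <tbody>
    <tr><td>id</td><td>name</td><td>value</td></tr>
  </tbody>
</table>
```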
But the most effective method is to split the data into multiple pages. The users probably prefer that, too; that is why, for example, Google displays only a limited number of results per page.