
Out of Memory Exception

 
Sreenivasulu Naidu
Greenhorn
Hi,

My application needs to create around 39,000,000 objects, and I use an ArrayList to hold them. I get an
OutOfMemoryError after roughly 1 million objects have been created and added. Here is my piece of code. Any thoughts?

// builds one bean per (site, service, calendar) combination
ArrayList<SABAccessRuleDataBean> explodedList = new ArrayList<SABAccessRuleDataBean>();
for (MSSSiteData site : siteList) {
    for (MSSServiceData svc : svcList) {
        for (MSSCalendarData cal : calList) {
            SABAccessRuleDataBean sabARObj = new SABAccessRuleDataBean();
            sabARObj.accessType = accessType_p;
            sabARObj.authSite_id = site.siteID;
            sabARObj.authSite_code = site.siteCode;
            sabARObj.authSite_desc = site.siteDesc;

            sabARObj.service_id = svc.svcID;
            sabARObj.service_code = svc.svcCode;
            sabARObj.service_desc = svc.svcDesc;

            sabARObj.calendar_id = cal.calID;
            sabARObj.calendar_code = cal.calCode;
            sabARObj.calendar_desc = cal.resourceObj.desc;

            explodedList.add(sabARObj);
        }
    }
}
 
Jeanne Boyarsky
author & internet detective
Marshal
Sreenivasulu,
A million records is a lot of memory. What are you trying to do with the objects - write to a file, some sort of processing, etc? Is it possible to do it in batches?
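
For illustration, if writing to a file turns out to be the goal, here is a rough sketch of streaming the combinations straight out instead of collecting them; the file name and CSV layout are assumptions, and the java.io/java.nio.file imports are omitted:

// Each row is written and released immediately instead of being kept
// in a 39-million-element list. File name and format are assumed.
try (BufferedWriter out = Files.newBufferedWriter(Paths.get("exploded-rules.csv"))) {
    for (MSSSiteData site : siteList) {
        for (MSSServiceData svc : svcList) {
            for (MSSCalendarData cal : calList) {
                out.write(site.siteCode + "," + svc.svcCode + "," + cal.calCode);
                out.newLine();
            }
        }
    }
}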
 
Kees Jan Koster
JavaMonitor Support
Rancher
So basically, you are trying to re-implement a database in Java. I'd suggest you put those 39M records in a database and then use SQL to query them. Usually, SQL is *way* faster than Java at searching through large sets of data.
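
A rough sketch of loading the rows into a table with plain JDBC; the table name, column names, and dataSource are assumptions for the example, and the java.sql imports are omitted:

// Stream the generated rows into a database table in batches instead of
// holding them all in memory. sab_access_rule and its columns are assumed.
String insert = "INSERT INTO sab_access_rule (access_type, site_id, service_id, calendar_id) VALUES (?, ?, ?, ?)";
try (Connection con = dataSource.getConnection();
     PreparedStatement ps = con.prepareStatement(insert)) {
    int pending = 0;
    for (MSSSiteData site : siteList) {
        for (MSSServiceData svc : svcList) {
            for (MSSCalendarData cal : calList) {
                ps.setObject(1, accessType_p);
                ps.setObject(2, site.siteID);
                ps.setObject(3, svc.svcID);
                ps.setObject(4, cal.calID);
                ps.addBatch();
                if (++pending == 1_000) {     // send every 1,000 rows to the database
                    ps.executeBatch();
                    pending = 0;
                }
            }
        }
    }
    if (pending > 0) {
        ps.executeBatch();                    // flush the remainder
    }
}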
 
Tim Holloway
Saloon Keeper
Kees Jan Koster wrote:Usually, SQL is *way* faster than Java at searching through large sets of data.


I'd be reluctant to make that assertion. For one thing, it depends on what you mean by "Java". For that matter, it depends on what you mean by "SQL". And there are some memory-resident SQL DBMSs, although they're not intended to store very large amounts of data.

The critical performance determinant is the size of the working set. That is, how much data has to be together in memory at the same time. Most commonly, when working with a large data set, you'll be iterating the data set proper, but you may need information from side tables which are smaller, but accessed randomly. So you might make those bits of data memory-resident and pass through the main data set stream-wise. That keeps the overall memory requirements down, saving resources and ensuring you don't end up compounding the situation courtesy of too much virtual memory paging (a/k/a "thrashing").
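
As a rough illustration of that pattern (the class, table, and method names below are assumed for the example, and the java.sql/java.util imports are omitted): the small side tables are loaded into maps once, while the large main table is streamed one row at a time.

// Small side tables cached in memory, large main table streamed row by row.
// ServiceInfo, CalendarInfo, loadServices(), loadCalendars(), processRow()
// and the table/column names are illustrative assumptions.
Map<Long, ServiceInfo> services = loadServices();       // small, accessed randomly
Map<Long, CalendarInfo> calendars = loadCalendars();     // small, accessed randomly

try (Connection con = dataSource.getConnection();
     PreparedStatement ps = con.prepareStatement(
             "SELECT site_id, service_id, calendar_id FROM access_rule")) {
    ps.setFetchSize(1_000);                               // stream, don't load everything
    try (ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
            ServiceInfo svc = services.get(rs.getLong("service_id"));
            CalendarInfo cal = calendars.get(rs.getLong("calendar_id"));
            processRow(rs.getLong("site_id"), svc, cal);  // working set stays one row
        }
    }
}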

ORMs are a really good solution to things like that, since they keep you from having to invent the various caching mechanisms from scratch and can be fine-tuned by altering declarative information instead of altering (and debugging!) custom code.
 
Kees Jan Koster
JavaMonitor Support
Rancher
Dear Tim,

OK, I was painting with an overly broad brush. To make my statement more precise: what I see happening with new Java devs who start working with SQL is that they basically do "SELECT * FROM table" and then iterate over the resulting set to do something useful. By moving the set operations into the SQL domain (e.g. SELECT COUNT(*) FROM table WHERE condition) instead of iterating in Java, performance is gained in all but edge cases.
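
For example, the counting can be pushed into the database instead of pulling every row back into Java; the table and column names here are only placeholders:

// The database computes the aggregate; Java reads back a single number.
// access_rule and access_type are placeholder names.
try (Connection con = dataSource.getConnection();
     PreparedStatement ps = con.prepareStatement(
             "SELECT COUNT(*) FROM access_rule WHERE access_type = ?")) {
    ps.setString(1, "READ");
    try (ResultSet rs = ps.executeQuery()) {
        rs.next();
        long count = rs.getLong(1);   // computed by the database, not in Java
    }
}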

Another common one is fetching a list with one JDBC query and then fetching the detail data for each item in the list with a separate SQL statement per item. That runs nicely for small lists but does not scale all that well.
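
The usual fix for that pattern is a single query that joins the detail rows in, so one round trip replaces one query per item; the master/detail names below are only illustrative:

// One JOIN query instead of one detail query per list item.
String sql = "SELECT m.id, m.name, d.value "
           + "FROM master m JOIN detail d ON d.master_id = m.id "
           + "WHERE m.status = ?";
try (Connection con = dataSource.getConnection();
     PreparedStatement ps = con.prepareStatement(sql)) {
    ps.setString(1, "ACTIVE");
    try (ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
            // each row already carries its detail columns; no extra query needed
        }
    }
}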

Kees Jan
 
steve souza
Ranch Hand
Although there may be exceptions, it sounds like a questionable design to put 39 million of these objects in memory. What if this number grows? At some point you will probably need to handle the situation where there are more objects than can be held in memory. Of course, it is hard to make any comments at all when we don't know what you are trying to accomplish.
 
Hemanth H Bhat
Greenhorn
I am not sure what you are trying to achieve, but you can try increasing the memory allocated to the JVM running this application using the -Xms and -Xmx arguments.
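
For example (the heap sizes and main class name here are placeholders, not values from this thread):

java -Xms512m -Xmx4g com.example.BatchLoader

Keep in mind that tens of millions of beans with several String fields each can easily add up to multiple gigabytes, so raising -Xmx may only move the point at which the error appears.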

 
Shariquddin Mohammed
Greenhorn
My suggestion is to use the database: query it with a prepared statement, fetch chunks of "relevant" data in batches, and then use a collection object in the application layer.
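
A short sketch of that idea; the query, chunk size, and processChunk() are all assumed for the example:

// Pull only the relevant rows and hand them to the application layer in
// fixed-size chunks. processChunk() and the table/columns are hypothetical.
try (Connection con = dataSource.getConnection();
     PreparedStatement ps = con.prepareStatement(
             "SELECT site_id, service_id, calendar_id FROM sab_access_rule WHERE access_type = ?")) {
    ps.setObject(1, accessType_p);
    ps.setFetchSize(5_000);                      // let the driver stream results
    try (ResultSet rs = ps.executeQuery()) {
        List<SABAccessRuleDataBean> chunk = new ArrayList<>();
        while (rs.next()) {
            SABAccessRuleDataBean bean = new SABAccessRuleDataBean();
            // ... populate bean from the current row ...
            chunk.add(bean);
            if (chunk.size() == 5_000) {
                processChunk(chunk);             // hand off to the app layer
                chunk.clear();
            }
        }
        if (!chunk.isEmpty()) {
            processChunk(chunk);
        }
    }
}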
 