How to reduce heap space used by an application in the specified scenario.

 
Kalpit Deewan
Greenhorn
Posts: 3
I am working on an application where I have to load data from a file. The data is stored as (index, value) pairs for different types of entities.
File content, for example:
-----------------------------------------------------
[Entity1]
1=Module1, 2=Module2,....360=Module360.
[Entity2]
1=Module1, 2=Module2,....360=Module360.
.
.
.
[Entity30]
1=Module1, 2=Module2,....360=Module360.
-----------------------------------------------------
Currently, the data is read and loaded into HashMaps, with one map corresponding to each entity. So we have 30 HashMaps of 360 entries each, which would consume a lot of heap memory.
Also, the number of entities/modules could increase depending on the requirements.
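For context, the loading code is roughly along the lines of this sketch (class and method names are made up; it just mirrors the structure described above, and assumes well-formed index=value tokens):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

public class ModuleConfigLoader {

    // Builds one inner map per [EntityN] section: index -> module name.
    public static Map<String, Map<Integer, String>> load(String path) throws IOException {
        Map<String, Map<Integer, String>> entities = new HashMap<String, Map<Integer, String>>();
        Map<Integer, String> current = null;
        BufferedReader in = new BufferedReader(new FileReader(path));
        try {
            String line;
            while ((line = in.readLine()) != null) {
                line = line.trim();
                if (line.startsWith("[") && line.endsWith("]")) {
                    // A section header such as [Entity1] starts a new entity map.
                    current = new HashMap<Integer, String>();
                    entities.put(line.substring(1, line.length() - 1), current);
                } else if (current != null && !line.isEmpty()) {
                    // Comma-separated index=value pairs, e.g. "1=Module1, 2=Module2"
                    for (String pair : line.split(",")) {
                        String[] kv = pair.trim().split("=", 2);
                        if (kv.length == 2) {
                            current.put(Integer.valueOf(kv[0].trim()), kv[1].trim());
                        }
                    }
                }
            }
        } finally {
            in.close();
        }
        return entities;
    }
}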

Is there a way I can avoid using so much heap space? (The application is to be deployed on a production server where it will be doing many other jobs at the same time.)

Thanks in advance.
 
Jeff Verdegan
Bartender
Posts: 6109

Kalpit Deewan wrote: I am working on an application where I have to load data from a file. The data is stored as (index, value) pairs for different types of entities.
File content, for example:
-----------------------------------------------------
[Entity1]
1=Module1, 2=Module2,....360=Module360.
[Entity2]
1=Module1, 2=Module2,....360=Module360.
.
.
.
[Entity30]
1=Module1, 2=Module2,....360=Module360.
-----------------------------------------------------
Currently, the data is read and loaded into HashMaps, with one map corresponding to each entity. So we have 30 HashMaps of 360 entries each, which would consume a lot of heap memory.



Are you assuming that? Or have you actually measured and found that it's consuming so much as to be a problem? Unless the key or value objects are very large, 30 * 360 = 10,800 entries is not much. Even if each entry is 1 kB, that's only about 10 MB of memory, which is not much to worry about in a desktop or server environment.
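If you'd rather check than guess, a crude sanity test is to build the structure and compare Runtime's free-memory figures before and after. This is only a rough sketch (the class name is invented, and Runtime measurements are approximate at best; a profiler or heap dump is more reliable):

import java.util.HashMap;
import java.util.Map;

public class MapFootprint {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.gc();
        long before = rt.totalMemory() - rt.freeMemory();

        // Rebuild the structure described: 30 entities x 360 (index, value) pairs.
        Map<String, Map<Integer, String>> entities =
                new HashMap<String, Map<Integer, String>>();
        for (int e = 1; e <= 30; e++) {
            Map<Integer, String> modules = new HashMap<Integer, String>();
            for (int m = 1; m <= 360; m++) {
                modules.put(m, "Module" + m);
            }
            entities.put("Entity" + e, modules);
        }

        System.gc();
        long after = rt.totalMemory() - rt.freeMemory();
        System.out.println("Approximate bytes retained: " + (after - before));
        System.out.println("(keeping " + entities.size() + " maps reachable)");
    }
}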

Is there a way I can avoid using so much heap space?



Either make the objects smaller or store fewer of them at once. You might look into a caching mechanism, or some kind of compression, or redesigning your classes so that you only store the bare minimum in the map and create the rest on the fly. Impossible to give more concrete advice without knowing more details.
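As one illustration of the "create on the fly" idea: in your sample data the value looks derivable from the index, and if that's really the case you don't need to store the values at all. A hypothetical sketch:

// Hypothetical: derive the value from the index instead of storing 360 entries.
// Only applicable if the values follow a rule, as they appear to in the sample file.
public class ModuleTable {
    private final String prefix; // e.g. "Module"

    public ModuleTable(String prefix) {
        this.prefix = prefix;
    }

    public String get(int index) {
        return prefix + index; // computed on demand; nothing stored per entry
    }
}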

Only do that, however, if you have actually measured and found unacceptable memory use that you can be sure is due to these maps. Just assuming they're going to "use lots of memory" is premature optimization and has a very high probability of being a total waste of time, and of adding additional complexity to your code for no real gain.
 
Kalpit Deewan
Greenhorn
Posts: 3
Thanks, Jeff, for looking into it and giving a nice explanation.

have you actually measured and found that it's consuming so much as to be a problem?



Yes, we have seen the application throw an OutOfMemoryError if it runs for a long time. We are debugging which classes are not getting GC'd and what else can be done to reduce the heap space consumed by the application.

Yes, you are right that it should not be a concern, since it's not consuming much of the heap space. However, I can see that there are many configuration files being read and kept in memory for as long as the application runs (as they are stored in HashMaps, they will never be GC'd).

I am just looking for a better caching mechanism that could be used here.

Also, there is one more issue:

We have identified a problem with the Apache log4j library used in our application: 1.2 million instances of the log4j Logger object were seen, which seems like too many!

On debugging, we found a HashMap<String, LoggerObjectInstance> that looks like the culprit for the 1.2M instances. When we need a logger object, we first check if it's in the HashMap; if the required object isn't there, we create a new instance and put it into the HashMap. This way, the next time we need it, it's right where we left it.
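In code, the pattern looks roughly like this sketch (the real class, field names, and value type differ):

import java.util.HashMap;
import java.util.Map;
import org.apache.log4j.Logger;

// Sketch of the get-or-create pattern described above.
public class LoggerCache {
    private final Map<String, Logger> loggers = new HashMap<String, Logger>();

    public synchronized Logger get(String name) {
        Logger logger = loggers.get(name);
        if (logger == null) {                 // cache miss
            logger = Logger.getLogger(name);  // obtain a logger for this name
            loggers.put(name, logger);        // remember it for next time
        }
        return logger;
    }
}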

The tricky part is the cleanup. At some point, the application needs to go through the HashMap and clean up the objects that haven't been used for a while and are wasting memory.

Where do we put this cleanup code? How often do we call it? These are problematic issues.

Would using a WeakHashMap solve it?


 
Jeff Verdegan
Bartender
Posts: 6109

Kalpit Deewan wrote: Thanks, Jeff, for looking into it and giving a nice explanation.

have you actually measured and found that it's consuming so much as to be a problem?



Yes, we have seen the application throw an OutOfMemoryError if it runs for a long time. We are debugging which classes are not getting GC'd and what else can be done to reduce the heap space consumed by the application.



Then you don't necessarily need to reduce the amount of memory used by these objects and operations on them. It may be that you're just hanging on to stuff that you no longer need, such as forgetting to remove something from a List or Map when you're done with it.
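Just as an illustration, that kind of leak often has this shape (all names hypothetical):

import java.util.HashMap;
import java.util.Map;

public class JobTracker {
    // Static and never cleaned: every entry stays reachable for the life of the JVM.
    private static final Map<String, Object> RESULTS = new HashMap<String, Object>();

    public static void finishJob(String jobId, Object result) {
        RESULTS.put(jobId, result);
        // ... use the result ...
        // Bug: RESULTS.remove(jobId) is never called, so the map only ever grows.
    }
}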


Also, there is one more issue:

We have identified a problem with the Apache log4j library used in our application: 1.2 million instances of the log4j Logger object were seen, which seems like too many!

On debugging, we found a HashMap<String, LoggerObjectInstance> that looks like the culprit for the 1.2M instances. When we need a logger object, we first check if it's in the HashMap; if the required object isn't there, we create a new instance and put it into the HashMap. This way, the next time we need it, it's right where we left it.

The tricky part is the cleanup. At some point, the application needs to go through the HashMap and clean up the objects that haven't been used for a while and are wasting memory.

Where do we put this cleanup code? How often do we call it? These are problematic issues.

Would using a WeakHashMap solve it?



Why are you even using that Map? Logger.getLogger() should do that for you. Why do you have a million different loggers? What are you using for their names?
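That is, code would normally just do something like this (the class name is invented for illustration; log4j caches loggers by name in its own repository, and repeated calls with the same class or name return the same instance):

import org.apache.log4j.Logger;

public class OrderService {
    // One logger per class, obtained straight from log4j's factory method.
    private static final Logger log = Logger.getLogger(OrderService.class);

    public void placeOrder() {
        log.debug("placing order");
    }
}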

A WeakHashMap should work. Or you could use a LinkedHashMap in LRU mode to remove entries that haven't been used for a while. If you end up having to re-create them later, so what? Or just periodically clear the whole map.
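For example, a minimal LRU sketch (the bound is whatever makes sense for your application):

import java.util.LinkedHashMap;
import java.util.Map;

// A LinkedHashMap in access-order mode drops the least-recently-used entry
// automatically once it grows past a fixed bound.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public LruCache(int maxEntries) {
        super(16, 0.75f, true); // accessOrder = true gives LRU iteration order
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries; // evict the eldest entry when over the bound
    }
}

You'd use it as Map<String, Logger> cache = new LruCache<String, Logger>(1000); and wrap it with Collections.synchronizedMap() if it's shared across threads.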
 