
Toplink causes memory leak?

 
Girish Vasmatkar
Ranch Hand
Posts: 201
Hello there,

I am getting a Java heap memory error on a select statement against a large database table. The table contains approximately 50,000 records, and the entity that I am fetching from the database has many fields.
I used JProbe to find the memory leak and found that there were many Object[] and char[] instances hanging around in heap memory. They were bound to some classes in TopLink.

Can it be avoided? I am fetching the entities in chunks in order to reindex the database entities.

Thanks and Regards,
Girish
 
Girish Vasmatkar
Ranch Hand
Posts: 201
Hello again,

Can someone please help me solve this issue? Could changing the TopLink version number solve it?
 
James Ward
Ranch Hand
Posts: 263
Your options:
1. Do not select all rows/data from the table. Select only what you need.
2. Turn off Caching.
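
For option 1, a JPQL projection query reads only the columns you need instead of building full managed entities. A minimal sketch, assuming a hypothetical entity MyEntity with id and name fields (the query itself is shown in a comment since it needs a live EntityManager; the rows below are simulated):

```java
import java.util.Arrays;
import java.util.List;

public class ProjectionExample {
    // Format one projection row (id, name) for display or indexing.
    static String formatRow(Object[] row) {
        return row[0] + ":" + row[1];
    }

    public static void main(String[] args) {
        // A projection query such as
        //     em.createQuery("SELECT e.id, e.name FROM MyEntity e").getResultList()
        // returns List<Object[]> instead of fully managed entities, so the
        // persistence context does not fill up with 50,000 tracked objects
        // (entity and field names here are hypothetical). Simulated rows:
        List<Object[]> rows = Arrays.asList(
            new Object[] {1L, "first"},
            new Object[] {2L, "second"});
        for (Object[] row : rows) {
            System.out.println(formatRow(row)); // e.g. "1:first"
        }
    }
}
```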
 
James Sutherland
Ranch Hand
Posts: 553
Do you run out of memory on a single select from the table? Or are you losing memory over time?

If it is the single select, then it is not a memory leak, you are just trying to read too much data for your JVM.
Either increase your heap size, or read less data.

In JPA you can chunk a query result using the JPA Query setFirstResult and setMaxResults methods.
In TopLink/EclipseLink you can also use a ScrollableCursor.

If reading into the same EntityManager, ensure you clear it between pages otherwise you can still run out of memory. Or get a new EntityManager per page.
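
The paging and clearing described above can be sketched like this. The JPA calls are shown in comments since they need a live EntityManager; the entity name MyEntity and the page size are hypothetical, and only the offset arithmetic runs here:

```java
public class ChunkedQuery {
    static final int PAGE_SIZE = 100;

    // First row index (offset) of a given zero-based page.
    static int firstResult(int page) {
        return page * PAGE_SIZE;
    }

    // Number of pages needed to cover `total` rows.
    static int pageCount(int total) {
        return (total + PAGE_SIZE - 1) / PAGE_SIZE;
    }

    public static void main(String[] args) {
        int total = 50_000; // roughly the table size from this thread
        for (int page = 0; page < pageCount(total); page++) {
            int offset = firstResult(page);
            // With a real EntityManager `em` (hypothetical entity MyEntity):
            //   List<MyEntity> chunk = em.createQuery(
            //           "SELECT e FROM MyEntity e ORDER BY e.id", MyEntity.class)
            //       .setFirstResult(offset)    // skip rows already processed
            //       .setMaxResults(PAGE_SIZE)  // fetch at most one page
            //       .getResultList();
            //   ... reindex the chunk ...
            //   em.clear();  // detach entities so the persistence context
            //                // does not grow with every page
        }
        System.out.println(pageCount(total) + " pages of " + PAGE_SIZE);
    }
}
```

The ORDER BY matters: without a stable ordering, setFirstResult/setMaxResults can skip or repeat rows between pages.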
 
Girish Vasmatkar
Ranch Hand
Posts: 201
First of all, thanks for the reply.
My database table has about 50,000 records and I am fetching those records in chunks exactly the way you mentioned.

James Sutherland wrote:

If it is the single select, then it is not a memory leak, you are just trying to read too much data for your JVM.
Either increase your heap size, or read less data.

In JPA you can chunk a query result using JPA Query setFirstResult and setMaxResults.
In TopLink/EclipseLink you can also use a ScrollableCursor.

But as I fetch more and more data, for example the first 100, then the next 100, I can see in Task Manager that memory consumption increases gradually,
and after about 23,000 records have been fetched the JVM blows up with an OutOfMemoryError.
Right now I am using the same EntityManager for the whole select; I will now try the select with a new EntityManager per page.


Thanks and Regards,
Girish

 