Issue: OutOfMemory error with simple application

 
Greenhorn
Posts: 27
Hi, I have a Java application that reads a log file, parses each log entry, and inserts it into the database (MySQL).
The issue happens when we try to process more than 3.5 lakh records: the application exits with an OutOfMemoryError.
I have tried nullifying all the applicable objects (String, ArrayList, etc.) once they are used, and I have also called System.gc() explicitly.
I would appreciate it if someone could help me with this.
 
Ranch Hand
Posts: 650

Pravin Jagan wrote:Hi, I have a Java application that reads a log file, parses each log entry, and inserts it into the database (MySQL).
The issue happens when we try to process more than 3.5 lakh records: the application exits with an OutOfMemoryError.
I have tried nullifying all the applicable objects (String, ArrayList, etc.) once they are used, and I have also called System.gc() explicitly.
I would appreciate it if someone could help me with this.

What does "3.5Lakh" mean? I'll assume it means a lot of records.

You're not showing any code, so it's really hard to see what you might be doing wrong.

Gazing into my crystal ball, I can offer this:

Have a look at the methods available on each of your JDBC objects (like statements, result sets, etc.). If they have close() methods, then you probably need to call close() when you're done with them. Start with the ones inside your "for each record" loop, but don't stop there. You should learn good object management early and keep using it forever.
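
For example, something along these lines (a sketch only, not your actual code: the log_entries table, its columns, the LogEntry/parse helper and the connection details are made up for illustration, the usual java.sql and java.io imports are assumed, and error handling is omitted):

    try (Connection conn = DriverManager.getConnection(url, user, password);
         PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO log_entries (logged_at, message) VALUES (?, ?)");
         BufferedReader reader = new BufferedReader(new FileReader(logFile))) {

        String line;
        while ((line = reader.readLine()) != null) {
            LogEntry entry = parse(line);             // hypothetical parsing step
            ps.setTimestamp(1, entry.getTimestamp()); // hypothetical getters
            ps.setString(2, entry.getMessage());
            ps.executeUpdate();
        }
    }   // reader, PreparedStatement and Connection are all closed here, even if an exception is thrown

With try-with-resources you don't even have to call close() yourself; everything declared in the parentheses is closed automatically, in reverse order, when the block ends.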

 
Sheriff
Posts: 21747

Mark E Hansen wrote:What does "3.5 lakh" mean? I'll assume it means a lot of records.


It sure does!

I agree about closing ResultSet, (Prepared)Statement and Connection objects once they are no longer needed. Furthermore, are you storing every line in memory before processing them all? If so, don't. Process them one by one, possibly using (Prepared)Statement's batch methods. Keep your batches relatively small (under 1,000 records); if they are too large you will run into memory problems again.

In pseudo code:
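(A sketch only, with the same made-up table, columns and parse helper as above; the usual imports and error handling are omitted.)

    final int BATCH_SIZE = 500;                       // comfortably under 1,000

    try (Connection conn = DriverManager.getConnection(url, user, password);
         PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO log_entries (logged_at, message) VALUES (?, ?)");
         BufferedReader reader = new BufferedReader(new FileReader(logFile))) {

        conn.setAutoCommit(false);                    // commit per batch, not per row
        int count = 0;
        String line;
        while ((line = reader.readLine()) != null) {  // one line at a time, never the whole file in memory
            LogEntry entry = parse(line);             // hypothetical parsing step
            ps.setTimestamp(1, entry.getTimestamp());
            ps.setString(2, entry.getMessage());
            ps.addBatch();

            if (++count % BATCH_SIZE == 0) {
                ps.executeBatch();                    // send the batch to MySQL and free it
                conn.commit();
            }
        }
        ps.executeBatch();                            // insert any leftover records
        conn.commit();
    }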
 