I'm presuming that "3Lac" means 3 lakh (300,000). I'm afraid that lakhs are not understood by most people outside of India, so please use more common units of measurement.
I'm also not sure what you mean by "unloaded", but my guess is that your ORM transactions are retrieving very large working sets and that is causing your memory issues, compounded by the memory not being released (which is what I think you mean by "unloaded") after you use it.
First of all, pulling 300,000 records into memory is not something you'd generally want to do. If you're manipulating that many records in a single operation, you should investigate ways to do it on the database server rather than in a remote (Java) application. And I say that as someone who doesn't like the trivial use of stored procedures and other vendor-dependent server-side operations; there is a time and a place for such things, and this sounds like one of them. If you're not actually doing anything with all those records, you should set up lazy fetching or some other means of fetch control. Pulling large amounts of data from a database and then not using it is a horrible waste of resources: not only memory, but CPU and network as well.
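As a rough sketch of the server-side approach, assuming you're on JPA/Hibernate (you didn't say which ORM): a JPQL bulk update lets the database do the work in one statement, so none of those rows are ever turned into Java objects. The persistence unit name "my-unit" and the Invoice entity with its status field are placeholders for whatever is in your own model.

```java
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class BulkUpdateOnServer {
    public static void main(String[] args) {
        // "my-unit" and the Invoice entity/fields are placeholders for your own model.
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("my-unit");
        EntityManager em = emf.createEntityManager();
        try {
            em.getTransaction().begin();
            // One statement executed by the database; no entities are loaded into Java.
            int updated = em.createQuery(
                    "UPDATE Invoice i SET i.status = :next WHERE i.status = :current")
                .setParameter("next", "ARCHIVED")
                .setParameter("current", "CLOSED")
                .executeUpdate();
            em.getTransaction().commit();
            System.out.println("Rows updated on the server: " + updated);
        } finally {
            em.close();
            emf.close();
        }
    }
}
```

The same idea applies to bulk deletes: the point is that the rows never cross the network and never become objects in your heap. Note that a bulk statement bypasses the persistence context, so clear or refresh any entities you already have loaded afterwards.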
Beyond that, the ORM isn't responsible for releasing the records in the working set. The records are stored in Java objects, and those objects are garbage-collected according to the usual rules: only when no references to them remain do they become eligible for collection (and thus for having their memory freed). If you can't work out which references are still being held, use the Java diagnostic tools (for example a heap dump taken with jmap or jcmd and inspected in VisualVM or Eclipse MAT) to find where the links are.
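If you genuinely do need to walk all of those rows in the Java application, a common pattern is to process them in pages and clear the persistence context between pages, so nothing keeps referencing the entities you've already handled. A minimal sketch, again assuming JPA and a hypothetical Invoice entity (all names here are illustrative, not from your code):

```java
import java.util.List;
import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.Id;
import javax.persistence.TypedQuery;

// Stand-in entity so the sketch is self-contained; substitute your own mapping.
@Entity
class Invoice {
    @Id Long id;
    String status;
}

public class PagedInvoiceProcessor {

    // Walk the table in fixed-size pages so that at most one page of
    // entities is referenced (and therefore resident in memory) at a time.
    static void processAll(EntityManager em, int pageSize) {
        for (int first = 0; ; first += pageSize) {
            TypedQuery<Invoice> query = em.createQuery(
                    "SELECT i FROM Invoice i ORDER BY i.id", Invoice.class);
            query.setFirstResult(first);
            query.setMaxResults(pageSize);
            List<Invoice> page = query.getResultList();
            if (page.isEmpty()) {
                break;                          // no more rows
            }
            for (Invoice invoice : page) {
                // ... per-record work goes here ...
            }
            // Detach the processed entities from the persistence context;
            // once nothing references them, they are eligible for garbage collection.
            em.clear();
        }
    }
}
```

If memory still isn't reclaimed after something like this, it almost certainly means your own code is holding references somewhere (a static collection, a cache, a long-lived session), and that is exactly what a heap dump will show you.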