AnushAmit Malhotra wrote:
(1) Are there memory leaks in OJDBC14 w.r.t. T4CPreparedStatement which are fixed in OJDBC7?
Looking at Oracle Support Doc ID 1362437.1, it says:
T4CPreparedStatements are creating a large number of char array objects while caching of statements is taking place.
This generally happens if the statement-cache-size of the datasources targeted to the server is very high.
The default value of statement-cache-size is 10 but when it is changed to higher values like 100 with multiple datasources targeted on the same server, many objects of type T4CPreparedStatements are created in heap which might lead to an OOME.
Reduce the statement-cache-size to resolve the issue.
Note that the prepared statement cache size is actually the capacity of the connection pool multiplied by the pool Statement Cache Size. If the pool has 10 connections and the Statement Cache Size is 10, there are actually 100 prepared statements in the pool, each taking up resources (see here).
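The sizing math above can be sketched as follows; the pool capacity and cache size values are hypothetical examples, not settings taken from your environment:

```java
public class StatementCacheSizing {
    // Worst-case number of cached prepared statements across the pool:
    // each pooled connection keeps its own statement cache, so the totals multiply.
    static int totalCachedStatements(int poolCapacity, int statementCacheSize) {
        return poolCapacity * statementCacheSize;
    }

    public static void main(String[] args) {
        // Hypothetical: a 10-connection pool with Statement Cache Size 10
        System.out.println(totalCachedStatements(10, 10)); // prints 100
    }
}
```

So raising the Statement Cache Size to 100 on several datasources targeted at the same server multiplies quickly, which is why the doc recommends reducing it.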
AnushAmit Malhotra wrote:
(2) When we have a piece of code as below
try {
    for (int i = 0; i < 100; i++) {
        ps = conn.prepareStatement(sql);
        rs = ps.executeQuery();
    }
} finally {
    ps.close();
    rs.close();
}
and close ps & rs only in the finally block, do the hanging references to ps & rs (the ones never closed inside the for loop) remain with the connection object? What happens when the connection object goes back to the WebLogic Connection Pool?
Does it carry this state or is it cleaned up?
Looking at Doc ID 1141024.1, it says:
The most common cause of this problem is that JDBC objects are not properly closed. Use the result from the third query in the Diagnostic Queries to trace back in application code to make sure that all JDBC objects are properly closed. Oracle recommends that you explicitly close JDBC objects, such as Connections, Statements, and ResultSets, in a finally block to make sure that all JDBC objects are closed under normal or exception condition...
Please avoid any code practice that abandons JDBC objects.
Although Statements and ResultSets should be closed when a Connection is closed, per JDBC specification, it's a good practice to explicitly close Statements and ResultSets right after you finish using them if you create multiple Statements on one Connection object. If you don't explicitly close Statements and ResultSets right away, cursors may accumulate and exceed the maximum number allowed in your DB before the Connection is closed.
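To avoid accumulating cursors in a loop like the one in the question, each Statement and ResultSet should be closed before the next iteration. A minimal sketch using try-with-resources (the method name and SQL string are illustrative, not from your code):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class CloseInLoop {
    static void runQueries(Connection conn, String sql) throws SQLException {
        for (int i = 0; i < 100; i++) {
            // try-with-resources closes rs and ps at the end of every
            // iteration, even if executeQuery or next throws, so open
            // cursors never accumulate on the connection.
            try (PreparedStatement ps = conn.prepareStatement(sql);
                 ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // process the row
                }
            }
        }
    }
}
```

With this shape there are no "hanging" ps/rs references left on the connection when it is returned to the pool, because every iteration releases its own resources.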
This document is specifically in reference to an Oracle database reporting "maximum open cursors exceeded". Is this the error you are running into?