shweta misra

Greenhorn
since Jun 06, 2012

Recent posts by shweta misra

I have to resolve a memory leak issue in a J2EE application running on the WebLogic application server.
I wanted some pointers on how to start working on this and which important aspects I should review.
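
For illustration, a common first step is to capture a heap dump under load and compare which object types keep growing. The sketch below is only my own illustration (not from the original application): it triggers a dump programmatically through the HotSpot diagnostic MBean, and the output path is a placeholder.

import java.lang.management.ManagementFactory;
import com.sun.management.HotSpotDiagnosticMXBean;

public class HeapDumper {
    public static void main(String[] args) throws Exception {
        // Proxy to the HotSpot diagnostic MBean exposed by Sun/Oracle JVMs
        HotSpotDiagnosticMXBean diag = ManagementFactory.newPlatformMXBeanProxy(
                ManagementFactory.getPlatformMBeanServer(),
                "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);

        // Write an .hprof file containing only live (strongly reachable) objects;
        // the path is a placeholder -- point it at a disk with enough free space.
        diag.dumpHeap("/tmp/weblogic-heap.hprof", true);
    }
}

Two dumps taken some time apart can then be compared in a heap analyser such as Eclipse MAT to see which classes are accumulating.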



11 years ago
I work in the telecom domain. We are designing a J2EE application which will run on the WebLogic server.

At the client tier there will be a JMS queue which will receive the incoming requests from an external application. Each request will pass through an EJB interceptor which validates it, and will then be passed to a message-driven bean. At the business tier, the MDB will call the local interfaces of a stateless session bean where all the business logic resides. The business logic will convert the request into a sequence of commands and pass it to the integration tier, where the commands will be sent over a socket connection to an external system (a physical network element in the operator's telecom network where certain provisioning needs to be done). The external system will respond with a success/failure response which will travel back to the calling application via a JMS queue.

The new application will integrate with multiple such physical network elements. Each network element has different IP/credentials and exposes a different interface, so the sequence of commands varies every time depending on which network element it needs to go to.
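
For illustration only, here is a minimal sketch of the MDB-to-session-bean hand-off described above, using EJB 3 annotations; the queue name, interface and method names are placeholders I made up, not taken from the actual application:

import javax.ejb.EJB;
import javax.ejb.Local;
import javax.ejb.MessageDriven;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.TextMessage;

// Hypothetical local interface of the stateless session bean in the business tier.
@Local
interface ProvisioningServiceLocal {
    void handleRequest(String payload);
}

// Sketch of the MDB at the client tier handing the validated request to the business tier.
@MessageDriven(mappedName = "jms/provisioningQueue") // queue name is a placeholder
public class ProvisioningMDB implements MessageListener {

    @EJB
    private ProvisioningServiceLocal service;

    public void onMessage(Message message) {
        try {
            String payload = ((TextMessage) message).getText();
            // the business tier turns this request into a command sequence
            // and forwards it to the integration tier
            service.handleRequest(payload);
        } catch (Exception e) {
            // real error handling / interceptor-based validation omitted
            throw new RuntimeException(e);
        }
    }
}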

I am not sure which J2EE technology to use in the Integration Tier. I have two solutions in mind:

1) Use the Java Connector Architecture (JCA). I selected this option because I read about J2EE design patterns and found that we should ideally use connectors in the integration tier. But I'm not sure if this is a good or correct option; moreover, I have never used it before, so I am learning it right now and don't know whether it is complex or simple.
2) Use a new stateless session bean for every interface with which the application needs to integrate (a rough sketch of this idea is included below).

Please let me know which one may be a good option, or whether there is a better solution.
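
For illustration of option 2, a common variant is to put a single Java interface in front of the per-element session beans (or their delegates) so the business tier does not care which network element it is talking to; the names below are mine, not from the actual design:

import java.io.IOException;
import java.util.List;

// Common contract for the integration tier: one implementation per network element type.
public interface NetworkElementConnector {

    // Sends the command sequence to the element and returns its raw response.
    String send(List<String> commands) throws IOException;
}

Each implementation (or the session bean wrapping it) would then hold that element's IP/credentials and translate the common command sequence into the element-specific protocol.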



We have an .ear application deployed on the WebLogic server. In this application a piece of logic is handled by a separate work manager.
The logic given below is written to create the work manager. We are using the commonj.work APIs.
No work manager is created in the WebLogic Administration console, and we have also not defined the work manager in weblogic-application.xml.

This code was working fine and the performance was good previously.

Recently we added additional logic in the run() method. Since then the overall performance per request has deteriorated. We have a load of approximately 200 TPS coming to this application.
We suspect this is because the work manager has not been defined properly and hence thread execution has slowed down.

Please give your comments if any.




import commonj.work.Work;
import commonj.work.WorkException;
import commonj.work.WorkItem;
import commonj.work.WorkManager;

public class XYZ implements Work {

    // Work manager obtained by name; it is not defined in the admin console
    // or in weblogic-application.xml
    private static WorkManager workmanager =
            WorkManagerFactory.getFactory().getWorkManager("WorkManager-PluginConnector");

    private WorkItem wi;

    public void run() {
        // <logic is written here>
    }

    public void scheduleTask() {
        try {
            // hand this Work instance to the container-managed thread pool
            wi = workmanager.schedule(this);
        } catch (IllegalArgumentException e) {
            // swallowed in the original code; at least log the failure
            e.printStackTrace();
        } catch (WorkException we) {
            we.printStackTrace();
        }
    }
}
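
For comparison, the CommonJ API is usually obtained through a JNDI lookup against a work manager that has actually been declared to the server (in the admin console or in weblogic-application.xml); the snippet below is only a sketch of that pattern, and the JNDI name is an assumption based on the name used above:

import javax.naming.InitialContext;
import commonj.work.Work;
import commonj.work.WorkItem;
import commonj.work.WorkManager;

public class WorkScheduler {

    // JNDI name is an assumption; it must match a work manager actually declared
    // to the server (admin console or weblogic-application.xml resource-ref).
    private static final String WM_JNDI = "java:comp/env/wm/WorkManager-PluginConnector";

    public WorkItem schedule(Work work) throws Exception {
        InitialContext ctx = new InitialContext();
        WorkManager wm = (WorkManager) ctx.lookup(WM_JNDI);
        // Hands the Work off to the container-managed thread pool; the returned
        // WorkItem can be used to wait for or inspect completion.
        return wm.schedule(work);
    }
}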
11 years ago
We are using Oracle 10g DB and no, we are not using any triggers or processes to modify this table.

By the way, we just found that the clock of one of our DB instances was actually running fast, so we have raised a request to bring it in sync and will then monitor the system.
Hopefully this will resolve the issue. Thanks for this pointer.


The message is logged after the insert statement is executed.

When we execute an insert statement, is the record immediately written into the DB, or is it put into some DB queue from where it is picked up and then executed? In our case, the logger is printed after the insert statement. Does this mean that the record should be written into the DB by that time, or is it possible that the query is accepted by the DB but not yet processed?
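
With JDBC, when auto-commit is on, executeUpdate() does not return until the statement has been executed and committed, so the row should be durable by the time the log line is printed. A small sanity-check sketch (my own, with assumed names) for the commit mode of the connection used for the insert:

import java.sql.Connection;
import java.sql.SQLException;

public final class CommitCheck {
    private CommitCheck() {}

    // Logs the commit mode and commits explicitly if auto-commit is off.
    public static void ensureCommitted(Connection conn) throws SQLException {
        boolean auto = conn.getAutoCommit();
        System.out.println("auto-commit = " + auto);
        if (!auto) {
            // With auto-commit off, the inserted row only becomes visible to
            // other sessions after this explicit commit.
            conn.commit();
        }
    }
}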
Hi Christophe,
We assumed the clocks should be the same because we see this issue intermittently, not for all the requests. I will however get the clocks checked.
Since auto-commit is on by default, we haven't committed the transactions explicitly.


Hi Martin,
The log message actually looks something like this: logger.info("CDR generated successfully for transcation id "+transcationId);
We compare the transaction id in the log with the one in the TRANSACTION_ID column of the table.
We have a load of 400-500 TPS and a record is written in the DB for every successful transaction, so this method is called for every successful transaction. In our analysis so far, for every 50 successful transactions we see this issue in 7-8 transactions, and the delay is always 3 minutes.
Hi,

Following is a snippet of the Java code which inserts a record into a table.
The insert query insertCDRSQL2 is also given below. When this code is executed, the observation is that the log message "CDR generation successfully" is printed at 11.31 pm in the log file, but in the DB the TRANSACTION_DATETIME column of the inserted record shows 11.34 pm. There is a 3-minute delay in actually writing the record. This happens intermittently for some inserts.

I want to know what could be the problem here and why there could be a delay.




try {

    conn = ocsgds.getConnection();                    // connection from the data source
    pre_stmt = conn.prepareStatement(insertCDRSQL2);

    // ..... (parameter binding elided)
    pre_stmt.executeUpdate();
    logger.info("CDR generation successfully");       // logged right after the insert

} catch (Exception e) {
    logger.error("Error in generating CDRs in the DB", e);
    throw e;
}





INSERT INTO CUSTOM_CDR (TRANSACTION_ID, TRANSACTION_DATETIME, CIRCLE, ......)
VALUES (?, systimestamp, ?, ........)

Thanks, Martin!

Our application is deployed on the WebLogic application server; the transactions are container-managed and the EJBs are stateless session beans. I found that for container-managed transactions, in case of a system exception the transaction is rolled back, and this includes DB updates as well.

As per our business logic we need to update certain DB tables before we throw the exception, but it looks like that is not possible.

Can you suggest any workaround?
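
For what it's worth, one common workaround (assuming EJB 3 style beans, which WebLogic 10.3 supports) is to move the DB update into a separate bean method marked REQUIRES_NEW, so it commits in its own transaction even though the caller's transaction is rolled back when the exception is thrown. A rough sketch with invented names:

import javax.ejb.Stateless;
import javax.ejb.TransactionAttribute;
import javax.ejb.TransactionAttributeType;

// Hypothetical helper bean: the update runs in its own transaction, so it
// survives the rollback of the caller's container-managed transaction.
@Stateless
public class AuditBean {

    @TransactionAttribute(TransactionAttributeType.REQUIRES_NEW)
    public void recordFailure(String sessionId, String status) {
        // persist the failure details here (JDBC / JPA); this commit is
        // independent of the transaction in which the exception is thrown
    }
}

The calling bean would inject this helper and call recordFailure(...) just before throwing. Another option in EJB 3 is to annotate the business exception with @ApplicationException(rollback = false) so that throwing it does not mark the transaction for rollback.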
Our application (.ear) is deployed on the WebLogic application server version 10.3.0.

We are facing a stuck-thread issue wherein an application thread gets stuck and the WebLogic server goes into the WARNING state. The thread that gets stuck is actually initiated by our application, and it gets stuck because our application is waiting for a response to a request that it has sent to an external system. The stuck-thread timeout value is 10 minutes and the response is not received before that, leading to the thread getting stuck.

We want to achieve the following:
We want to set some connection timeout parameter in the server, say 2 seconds. When the timeout period expires and no response has been received, some exception should be thrown automatically and caught by our application. This will help our application exit gracefully and avoid the stuck thread.

Please suggest where in the server we can find such a parameter.


(We found that in the Tomcat server there is a connection-timeout parameter in server.xml which serves the same purpose. We want something similar in the WebLogic application server.)
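
Independently of any server setting, the wait itself happens in the application's own socket call to the external system, so a timeout can also be set directly on that socket. A minimal sketch; the host, port and 2-second values are placeholders:

import java.io.IOException;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class TimedRequest {
    public static void main(String[] args) throws IOException {
        Socket socket = new Socket();
        try {
            // Fail fast if the external system cannot be reached at all.
            socket.connect(new InetSocketAddress("192.0.2.10", 9000), 2000); // connect timeout: 2 s
            // Any read() that waits longer than 2 s throws SocketTimeoutException
            // instead of blocking until WebLogic marks the thread as stuck.
            socket.setSoTimeout(2000);

            InputStream in = socket.getInputStream();
            int first = in.read();
            System.out.println("first byte: " + first);
        } catch (SocketTimeoutException e) {
            // Caught by the application: exit gracefully instead of hanging for 10 min.
            System.err.println("No response from external system within 2 s: " + e.getMessage());
        } finally {
            socket.close();
        }
    }
}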
12 years ago
Hi,

We are facing a problem in our Java application while inserting and updating data in the DB.
The code for the inserts/updates has been written in core Java (JDK 1.6).

The steps are as follows:
1) Code is written to insert/update data in 3 tables in the DB.
2) Soon after step 1, an exception is thrown.

A snippet of the code is given below:

dbentry.finalDBEntriesForSessionRelease(sessionId, "UPDATE", customCdrSession, "error_closed"); // update the db

throw ExceptionType.SERVICE_ERROR.createEx(); // throw exception

We observed that the Java code runs perfectly and the exception is also thrown as expected. The logs show that the DB inserts/updates are done; there are no errors.
But the problem is that the DB inserts done in the 3 tables are not visible when we actually go and look into the tables, i.e. we are not able to see the new row that was inserted or the changed rows that were updated. It is as if nothing happened. There are no errors either; the code works perfectly.

Another observation is that the moment we remove the "throw Exception..." line, we are able to see the DB changes in the tables.

Does anyone know why this is happening and how to resolve it?
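
The symptom (changes visible only when the throw is removed) is what one would expect when the inserts run inside a transaction that the exception causes to roll back. If the code manages its own JDBC connection, a minimal sketch of committing the updates before the exception leaves the method could look like the following; the table, columns and method names are assumptions for illustration. If the updates instead run inside a container-managed EJB transaction, an alternative is to run them in a separate method marked REQUIRES_NEW.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public final class ReleaseWithAudit {
    private ReleaseWithAudit() {}

    // Sketch: persist the error status in its own committed unit before failing.
    public static void closeWithError(Connection conn, String sessionId) throws SQLException {
        boolean previous = conn.getAutoCommit();
        conn.setAutoCommit(false);
        try {
            PreparedStatement ps = conn.prepareStatement(
                    "UPDATE CUSTOM_CDR_SESSION SET STATUS = ? WHERE SESSION_ID = ?"); // table/columns assumed
            try {
                ps.setString(1, "error_closed");
                ps.setString(2, sessionId);
                ps.executeUpdate();
            } finally {
                ps.close();
            }
            conn.commit(); // the update survives even if an exception is thrown afterwards
        } catch (SQLException e) {
            conn.rollback();
            throw e;
        } finally {
            conn.setAutoCommit(previous);
        }
        // the business exception (SERVICE_ERROR) can now be thrown by the caller
    }
}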