This is why using a batch process may be a good idea. Another option is to use a scheduled job on PostgreSQL.
You may schedule your stored procedure to run every x minutes (or hours), and make it actually process data only if some column in a "trigger job table" contains a given value. For example, if your stored procedure reads '0', it simply returns; if it reads '1', it starts deleting and doing its job.
To make your stored procedure run in the background, your application simply updates the 'flag' column in the trigger job table to '1'.
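A minimal PL/pgSQL sketch of this flag-table pattern; the table, column, and function names here are illustrative, and the scheduling line assumes the pg_cron extension is installed (pgAgent or an OS-level cron job calling psql would work just as well):

```sql
-- Hypothetical trigger job table: the application sets flag = 1 to request a run.
CREATE TABLE trigger_job (
    id   integer PRIMARY KEY,
    flag integer NOT NULL DEFAULT 0
);

CREATE OR REPLACE FUNCTION run_if_flagged() RETURNS void AS $$
BEGIN
    -- Do the heavy work only when the application has raised the flag.
    IF EXISTS (SELECT 1 FROM trigger_job WHERE id = 1 AND flag = 1) THEN
        -- ... delete old rows and do the real processing here ...

        -- Reset the flag so the next scheduled run is a cheap no-op.
        UPDATE trigger_job SET flag = 0 WHERE id = 1;
    END IF;
END;
$$ LANGUAGE plpgsql;

-- Example schedule, every 5 minutes, assuming pg_cron is available:
-- SELECT cron.schedule('*/5 * * * *', 'SELECT run_if_flagged()');
```

When the flag is '0' the scheduled call returns almost immediately, so running it frequently costs very little.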
The problem in this scenario is how to notify your application that the data has been processed, supposing that this notification should happen automatically and there's nothing like, let's say, a web page where a user goes and gets the results (if any).
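One simple way to close that loop is to have the stored procedure write a completion record that the application polls for; this is only a sketch, and the table and column names are made up for illustration:

```sql
-- Hypothetical results table the stored procedure fills in when it finishes.
CREATE TABLE job_result (
    job_id      integer PRIMARY KEY,
    finished_at timestamptz,
    row_count   bigint
);

-- The application runs this periodically; a non-empty result set
-- means the background job has completed and its output is ready.
SELECT job_id, finished_at, row_count
FROM job_result
WHERE finished_at IS NOT NULL;
```

If polling is undesirable, PostgreSQL's LISTEN/NOTIFY mechanism can push a message to a connected client instead, though that requires the application to keep a connection open and listen on a channel.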
nishu midha wrote:That's why I wanted to use a stored procedure: after triggering the stored procedure I wanted to close the connection, so that the query continues to run on the PostgreSQL database side and, after completion, writes the result to a table; later I can collect the data from that table. Is it possible?
Is there any possibility to execute this as a batch process?
We have done something similar: each night roughly 1 million records are inserted into new tables, and then several stored procedures are executed to perform a bunch of business-related tasks and conversions so that our application can work with this data. The whole process of importing the data and running the stored procedures takes about 50 minutes. It's processed in batch and runs each night.