It might make sense, from the point of view of programming effort, to put a huge operation into one transaction. However, from the database's point of view, it's not a good idea to put a long-running operation that updates a lot of rows inside a single transaction.
1) Most databases keep a redo log of the currently open transactions. Redo logs are generally kept in the more expensive storage tiers, since most operations write to them frequently; that means the redo log lives either in memory or on one of the faster disks. It's never a good idea to force gobs of data into memory: for one thing, it will push other data out of the buffers, and other transactions running at the same time will take a hit.
2) A transaction will lock records. Different databases have different locking rules, but there are always locks of some kind, and locks introduce bottlenecks into the system. The database uses locks to keep the data from getting dirty; the side effect is that other transactions trying to update the same records are blocked. That is the bottleneck, and it means your system does not scale.
IMO, you should go back to having smaller transactions and take the hit of cleaning out dirty data afterwards. Transactions are meant for "small" operations where the consistency of the data is important enough that you are willing to pay the price of introducing bottlenecks into the system.
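The smaller-transactions approach can be sketched like this, again with `sqlite3` as a stand-in. The `events` table, `processed` flag, and batch size are all assumptions for illustration: the point is that each batch commits quickly, so no lock is held for long.

```python
# Sketch: doing a big update as many short transactions instead of one huge one.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, processed INTEGER)")
conn.executemany("INSERT INTO events (processed) VALUES (?)", [(0,)] * 10_000)
conn.commit()

BATCH = 1_000
while True:
    # Claim a small batch of unprocessed rows and mark them done.
    cur = conn.execute(
        "UPDATE events SET processed = 1 "
        "WHERE id IN (SELECT id FROM events WHERE processed = 0 LIMIT ?)",
        (BATCH,),
    )
    conn.commit()          # short transaction: locks are released right away
    if cur.rowcount == 0:  # nothing left to claim
        break
```

The trade-off is exactly the one above: if the job dies mid-run, the already-committed batches stay committed, and that is the dirty data you may have to clean out afterwards.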
If I were you, I would also take a closer look at your design. What you have described here is a classic ETL job. You are running some huge operations to extract the data, transform it, and spit out a "Template" (which I am assuming is some sort of file that either a human or an external system can read). Most ETL frameworks do not open transactions or take write locks on the system, exactly for the reasons mentioned above. In fact, most databases have utilities that can quickly extract data in CSV format, and a lot of ETL frameworks leverage that: they take a quick extract out of the database and then do the transformations on the extracted data. In most cases, you should avoid holding locks for an extended period of time on records that other transactions are going to be using.
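The extract step can be sketched as follows, with a hypothetical `orders` table. A real database's bulk tools (PostgreSQL's `COPY`, MySQL's `SELECT ... INTO OUTFILE`, the sqlite3 CLI's `.mode csv`) will be much faster, and that is what ETL frameworks typically lean on; this just shows the shape of the approach.

```python
# Sketch: read-only extract to CSV, with all transformation done afterwards
# on the file, outside the database.
import csv
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 12.0)])
conn.commit()

# A plain read-only SELECT takes no write locks, so concurrent updates
# keep flowing while the extract runs.
cur = conn.execute("SELECT id, amount FROM orders")
with open("orders.csv", "w", newline="") as f:
    out = csv.writer(f)
    out.writerow([col[0] for col in cur.description])  # header row
    out.writerows(cur)                                 # data rows
```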
I'm not sure why you need to update the database while you are trying to extract data out of it. It's rather unusual.