Best solution for a large database transaction

cobol, rdbms, spring, stored-procedures, transaction

I have a COBOL CICS process running on a legacy mainframe system. The process performs over 2K DMLs in a highly concurrent DB environment. After each CRUD operation, the result is used for further computations, which are followed by more CRUD operations. The entire process runs inside a single transaction block: any failure before successful completion rolls back the entire set of DB changes, and a commit happens only when every step succeeds. The COBOL program is written so that it takes no more than 20-odd seconds to complete or fail, so that other processes don't starve for DB resources (locks, stalls, etc.) because of the size of this transaction.

Now we are commissioned to expose this legacy application as a Java SOAP web service. The three approaches I have are:

  1. The Service layer of my Java code sends a message with the input parameters to a queue; the already existing COBOL program picks the message off the queue and performs the transaction. The Java layer polls the queue asynchronously and, once it gets a result from the COBOL process, returns it to the Controller.

  2. Write a stored procedure that mimics the job done by the COBOL process, including the DB transaction, and call that SP from the Java Service layer.

  3. Write the business logic in the Java Service/DAO layer and run the DMLs inside a Spring-managed transaction block.
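To make approach #3 concrete, here is a minimal, hypothetical sketch of the all-or-nothing demarcation that a Spring transaction block provides (Spring's `@Transactional` does the begin/commit/rollback for you; it is written out by hand here against an in-memory "table" so the pattern is visible, and all class and method names are invented for illustration):

```java
import java.util.HashMap;
import java.util.Map;

/** Hypothetical sketch of approach #3: all DMLs succeed, or none apply.
 *  A real implementation would demarcate with Spring's @Transactional
 *  over JDBC/JPA; here a snapshot of an in-memory map stands in for
 *  the database's rollback mechanism. */
public class AtomicProcess {
    private final Map<String, Long> balances = new HashMap<>();

    public AtomicProcess() {
        balances.put("A", 100L);
        balances.put("B", 100L);
    }

    /** Runs a series of DML-like steps atomically: any failure restores
     *  the pre-transaction state, mirroring the COBOL program's single
     *  commit-or-rollback behaviour. */
    public boolean transfer(String from, String to, long amount, boolean failMidway) {
        Map<String, Long> snapshot = new HashMap<>(balances);      // "begin transaction"
        try {
            balances.merge(from, -amount, Long::sum);              // DML step 1
            if (failMidway) throw new IllegalStateException("step failed");
            balances.merge(to, amount, Long::sum);                 // DML step 2
            return true;                                           // "commit"
        } catch (RuntimeException e) {
            balances.clear();
            balances.putAll(snapshot);                             // "rollback"
            return false;
        }
    }

    public long balance(String acct) { return balances.get(acct); }
}
```

The point of the sketch is only the shape: one demarcation around every step, with nothing made visible to other processes until the final commit.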

The business doesn't want approach #1 because it retains a dependency on the legacy COBOL program. I have two doubts here:

  1. My understanding is that SPs are better from a performance point of view. But will an SP be able to handle 2K-odd or more DMLs within a transaction block and finish in a time at least comparable to the COBOL process? If so, will it acquire and release DB resources (locks) efficiently? The DB is highly concurrent and I don't want other processes to starve because of it.

  2. I have never done a DB transaction of this magnitude from Java code, and I doubt whether a Spring transaction block can hold 2K DMLs in a single block effectively. Even if it can, I suspect it won't match the speed of COBOL or an SP, so it may hold locks on records for a long time and starve other processes that need to access them.

The transactions are not against a single DB table; they are spread over 20-odd different tables containing critical financial data. I was thinking of breaking the large number of CRUD operations into chunks of Spring transaction blocks, but then coding the entire rollback logic in Java would be a humongous task. Still, my main concern is blocking other processes from DB access in the meanwhile.
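The chunking idea above implies hand-rolled compensation: each chunk commits immediately, so undoing a failed run means replaying an inverse for every already-committed change, in reverse order. A tiny, hypothetical sketch (invented names, in-memory map standing in for the DB) shows the shape of that undo log, and hints at why writing a correct inverse for every one of 2K DMLs across 20 tables is so error-prone:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

/** Hypothetical sketch of chunked commits with hand-rolled compensation.
 *  Each chunk "commits" as it runs; if a later chunk fails, previously
 *  committed chunks must be undone one by one, newest first. */
public class ChunkedProcess {
    private final Map<String, Long> table = new HashMap<>();
    private final Deque<Runnable> undoLog = new ArrayDeque<>();

    /** Applies one committed chunk and records its compensating action. */
    public void applyChunk(String key, long delta) {
        table.merge(key, delta, Long::sum);                       // chunk commits here
        undoLog.push(() -> table.merge(key, -delta, Long::sum));  // its hand-written inverse
    }

    /** On a later failure, replay compensations in reverse order.
     *  Every single DML needs a correct inverse for this to be sound,
     *  and other processes can observe the intermediate states. */
    public void compensateAll() {
        while (!undoLog.isEmpty()) undoLog.pop().run();
    }

    public long value(String key) { return table.getOrDefault(key, 0L); }
}
```

Note the trade-off this illustrates: chunking shortens individual lock hold times, but in exchange other processes can see half-applied financial state between chunks, which is exactly what the single-transaction design avoids.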

I am a Java developer with meager knowledge of RDBMS and practically no knowledge of COBOL. I would be grateful if someone could point me in the right direction so that I can work out a solution myself.

Best Answer

I'd say 99 times out of 100 the answer is to grit your teeth and re-use the existing COBOL code. It sounds like it might have some fairly complex logic behind the scenes; unless you actually have an up-to-date and accurate spec, you might have trouble reproducing that logic in all its nuances. This is bad mojo on a financial application, as the bean counters will want bug-for-bug compatibility with the legacy code.

So, as a first stop, I would suggest that you seriously investigate what is needed to wrap your legacy process or integrate it with the front-end. You might take a look at the COBOL-Java integration options available from Micro Focus.

If you really can't re-cycle the existing code then you have a significant problem. While a sproc might well do what you want, you have the problem of making something mimic the exact behaviour of an apparently complex and probably poorly documented COBOL application.

Unless your replacement process is radically slower, wrapping 2000 DMLs in a transaction isn't going to be a major issue. It's possible to wrap entire ETL processes in a transaction and have them roll back on a failure. I have seen an application where it was necessary to do this, although it is unusual. Splitting a financial transaction is a no-no, especially on a double entry system. In almost all cases it should commit or roll back atomically. Stick with the single transaction.

The fact that the COBOL application does computations one DML at a time and uses the results in the next step does not bode well for conversion to set operations. Be prepared to do this row-by-row and find a way to make it run fast. A stored procedure may well be the way to do this. You don't say what DBMS you're using (DB2 for z/OS?).

Also, you might have to drop down to XA transactions (the JTA `javax.transaction` API) and plain JDBC, especially if you need to coordinate work in the client layer. If you want good performance on 2,000 row-based operations, you may have to go below the framework.
