I have a large amount of data to write to the database, e.g. an array of 24,000 CDT rows. I see two approaches:

A) Write to the database in batches: pick a batch size of 200 and write 200 rows at a time sequentially. This approach will take some time to write everything to the DB.

B) Use MNI in the process model: still use the same batch size, but have more than one Write to DS smart service execute simultaneously. This increases the number of parallel writes to the DB and should also complete the process model faster.
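Appian's Write to Datastore Entity isn't something you script directly, but the two approaches map onto a familiar pattern in any DB client. Here is a minimal Python sketch of that trade-off, with hypothetical table/column names and SQLite standing in for the real database, just to make the batching vs. parallelism difference concrete:

```python
import sqlite3
from concurrent.futures import ThreadPoolExecutor

BATCH_SIZE = 200

def batches(rows, size):
    """Yield successive fixed-size slices of the row list."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def write_batch(db_path, batch):
    """One batched INSERT per call -- the analogue of a single Write to DS execution."""
    conn = sqlite3.connect(db_path)
    try:
        with conn:  # one transaction per batch
            conn.executemany("INSERT INTO my_table (a, b) VALUES (?, ?)", batch)
    finally:
        conn.close()

rows = [(i, f"value-{i}") for i in range(24_000)]  # stand-in for the 24,000 CDT rows

with sqlite3.connect("example.db") as conn:
    conn.execute("CREATE TABLE IF NOT EXISTS my_table (a INTEGER, b TEXT)")

# Approach A: sequential batches -- one connection/transaction at a time.
for batch in batches(rows, BATCH_SIZE):
    write_batch("example.db", batch)

# Approach B: parallel batches (the MNI analogue). Each worker opens its own
# connection, so max_workers caps how many DB connections exist at once.
# Note: SQLite serializes writers, so true write concurrency needs a server DB.
with ThreadPoolExecutor(max_workers=4) as pool:
    list(pool.map(lambda b: write_batch("example.db", b), batches(rows, BATCH_SIZE)))
```

The `max_workers` bound in the sketch is the crux of the question below: parallelism speeds up wall-clock time only while the database and its connection pool can absorb the concurrent writers.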

My question: will the parallel writes to the DB using MNI affect performance? I also suspect it will open too many connections to the database, so is it a good idea?

OriginalPostID-215023


  • Also for reference, we have found the Write to DS node to be very efficient. In our environment a nightly process writes over 28,000 rows (27 columns each) via a single CDT and a single Write to DS node each evening. It runs off-hours and takes about 2 minutes to complete, and has not caused any noticeable issues (just be sure to schedule around checkpointing). The process parses a CSV file, adds all rows to the CDT, performs updates on the CDT, then writes the entire CDT to the DB via one Write to DS node.
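For illustration, the shape of that nightly job in the same sketch style as above (hypothetical CSV columns and table name; Python's stdlib stands in for the process model steps):

```python
import csv
import sqlite3

def nightly_load(csv_path, db_path):
    """Parse a CSV, transform the rows in memory, then persist everything in one
    bulk write -- the analogue of building one CDT and using a single Write to DS node."""
    with open(csv_path, newline="") as f:
        # "id" and "name" are hypothetical column headers for this sketch.
        rows = [(r["id"], r["name"].strip()) for r in csv.DictReader(f)]
    conn = sqlite3.connect(db_path)
    try:
        with conn:  # a single transaction covering all ~28,000 rows
            conn.executemany("INSERT INTO my_table (a, b) VALUES (?, ?)", rows)
    finally:
        conn.close()
```

The design point is that one large transactional write keeps connection usage at exactly one, which is why the job can run unattended without straining the database.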