I have a lot of data that needs to be written to the database, for example an array of around 24,000 CDT rows. I see two approaches:

A) Write to the database in batches: pick a batchSize of 200 and write 200 rows at a time sequentially. This approach will take some time to write everything to the DB (a sketch of this is shown below my question).

B) Use MNI in the process model: still use a batchSize, but have more than one Write to DS smart service execute simultaneously. This increases the number of parallel writes to the DB and should also let the process model finish faster.

My question: will parallel writes to the database via MNI hurt performance? I also suspect it will open too many connections to the database at once, so is it a good idea?
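For clarity, here is a minimal sketch of approach A (sequential batched writes) in plain Python, using sqlite3 purely as a stand-in target database. The rows, the example table, and BATCH_SIZE are illustrative assumptions; this is not Appian's Write to DS smart service, just the batching pattern being described.

```python
import sqlite3

BATCH_SIZE = 200  # rows per write, mirroring the batchSize discussed above

# Hypothetical stand-in for the ~24,000 CDT rows; in Appian these would be CDT
# instances handed to the Write to DS smart service.
rows = [(i, f"value-{i}") for i in range(24000)]

conn = sqlite3.connect(":memory:")  # illustrative target DB, not an Appian data store
conn.execute("CREATE TABLE example (id INTEGER PRIMARY KEY, payload TEXT)")

# Approach A: sequential batched writes, one transaction per batch of 200 rows.
for start in range(0, len(rows), BATCH_SIZE):
    batch = rows[start:start + BATCH_SIZE]
    with conn:  # commits (or rolls back) each batch as its own unit of work
        conn.executemany("INSERT INTO example (id, payload) VALUES (?, ?)", batch)

print(conn.execute("SELECT COUNT(*) FROM example").fetchone()[0])  # prints 24000
conn.close()
```

Approach B would amount to running several of these batch writes concurrently, each holding its own database connection, which is exactly what raises my concern about the number of open connections.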

OriginalPostID-215023


  • @sikhivahans, thank you. I went through those links. Your comment that 1,000 records with 6 columns was flagged as a health risk by the Appian Health Check gives an idea of how much data can safely be written at a time.
    So I will go with the sequential-writes approach. I was only hoping that parallel writes might get the work done faster, but since they may cause issues, I would rather avoid them.