I have a lot of data that needs to be written to the database, for example an array of 24,000 CDT rows. I see two approaches:

A) Write to the database in batches: pick a batchSize of 200 and write 200 rows at a time, sequentially. This approach will take some time to finish writing everything to the DB.

B) Use MNI in the process model: still use a batchSize, but have more than one Write to Data Store Entity smart service execute simultaneously. This increases the number of parallel writes to the DB and also completes the process model more quickly.
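Outside Appian, option A is conceptually the same as a plain JDBC batch insert: one connection, rows flushed 200 at a time. A minimal sketch, assuming a hypothetical table and column ("my_cdt", "field1") purely for illustration:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;

public class BatchWriter {
    private static final int BATCH_SIZE = 200;

    // Writes all rows sequentially, 200 at a time, over a single connection.
    // Table and column names are placeholders, not from the original post.
    static void writeInBatches(List<String> rows, String jdbcUrl) throws Exception {
        try (Connection conn = DriverManager.getConnection(jdbcUrl);
             PreparedStatement ps =
                     conn.prepareStatement("INSERT INTO my_cdt (field1) VALUES (?)")) {
            int count = 0;
            for (String row : rows) {
                ps.setString(1, row);
                ps.addBatch();
                if (++count % BATCH_SIZE == 0) {
                    ps.executeBatch();   // flush one batch of 200 rows
                }
            }
            ps.executeBatch();           // flush the final partial batch
        }
    }
}
```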

My question: will the parallel writes to the DB using MNI affect performance? I also think it will open too many connections to the database, so is it a good idea?
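For option B, the key variable is how many writers (and therefore DB connections) are active at the same time. A rough sketch of the idea, assuming a bounded worker pool rather than unbounded parallelism, and reusing the hypothetical single-batch writer from the sketch above:

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelBatchWriter {
    // Bounds the number of simultaneous writers, and therefore the number of
    // concurrent DB connections, no matter how many batches there are.
    static void writeInParallel(List<List<String>> batches, String jdbcUrl)
            throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(4); // 4 writers, illustrative only
        for (List<String> batch : batches) {
            pool.submit(() -> {
                try {
                    // Each task opens its own connection and writes one batch of 200 rows.
                    BatchWriter.writeInBatches(batch, jdbcUrl);
                } catch (Exception e) {
                    e.printStackTrace(); // real code would retry or record the failed batch
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }
}
```

In the Appian case, the number of MNI instances allowed to run at once plays roughly the role of the worker pool size here, and the data source's configured connection pool is what ultimately caps simultaneous connections, so the practical question is how those two limits interact under your load.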

OriginalPostID-215023


  • You must consider the growth of the data in the foreseeable future and design your solution in such a way that it handles the increase in data volume efficiently and continues to function without breaking. It's always desirable to achieve a particular piece of functionality with Appian OOTB features (exactly what your solution does) as long as that is possible, and adopting this OOTB approach could be the best option in your case, but you must consider all aspects of your solution (impact of future data growth, making multiple DB calls, performance, etc.) before finalizing it. I recommend you go through this link if you have not already done so: forum.appian.com/.../Transferring_Processing_Large_Data_Sets_(ETL).html.