Appian Community
I have a lot of data which is to be written to Database. It can be like an array of 24,000 CDT variables.
chetany
A Score Level 1
over 9 years ago
I have a lot of data which needs to be written to the database. It can be an array of around 24,000 CDT variables. I have two approaches for this:

A) Write to the database in batches: I can decide on a batchSize of 200 and write 200 rows at a time, sequentially. This approach will take some time to write everything to the DB.

B) Use MNI in the process model: I will still use a batchSize, but now more than one Write to DS smart service will execute simultaneously. This will increase the parallel writes to the DB and will also complete the process model more quickly.
My question: will the parallel writes to the DB using MNI affect performance? I also think it will open too many DB connections to the database, so is it a good idea?
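For context, approach A (sequential batched writes) boils down to slicing the full row set into fixed-size chunks and writing one chunk per statement. This is a minimal generic sketch of that loop shape using Python and sqlite3; the actual Appian mechanism is the Write to DS smart service, and the table name and columns here are hypothetical:

```python
import sqlite3

BATCH_SIZE = 200  # the batchSize from approach A

def write_in_batches(conn, rows, batch_size=BATCH_SIZE):
    """Write rows to the DB sequentially, one fixed-size batch at a time."""
    cur = conn.cursor()
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        # Hypothetical table; stands in for one Write to DS invocation per batch.
        cur.executemany("INSERT INTO records (id, val) VALUES (?, ?)", batch)
        conn.commit()  # one commit per batch keeps each transaction small

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER, val TEXT)")
rows = [(i, f"row{i}") for i in range(24000)]
write_in_batches(conn, rows)
print(conn.execute("SELECT COUNT(*) FROM records").fetchone()[0])  # 24000
```

With 24,000 rows and a batchSize of 200 this is 120 sequential writes; approach B would run several of these batches concurrently, each holding its own DB connection, which is the source of the connection-pool concern.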
OriginalPostID-215023
Chris
over 9 years ago
Also for reference, we have found the Write to DS node to be very efficient. In our environment we have a nightly process that writes over 28,000 rows (27 columns each) via a single CDT and a single Write to DS node each evening. It runs off-hours and takes about 2 minutes to complete, but has not caused any noticeable issues (just be sure to schedule around checkpointing). The process parses a CSV file, adds all rows to the CDT, performs updates on the CDT, then writes the entire CDT to the DB via one Write to DS node.
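The flow described above (parse a CSV, build the full row set, then write everything in one go) can be sketched generically like this. It is only an illustration of the single-write pattern, not Appian code; sqlite3, the inline CSV, and the `records` table are stand-ins, with the one-transaction `executemany` playing the role of the single Write to DS node:

```python
import csv
import io
import sqlite3

# Hypothetical CSV payload standing in for the nightly file.
csv_text = "id,val\n" + "\n".join(f"{i},row{i}" for i in range(1000))

# Parse the CSV into the full row set (the analogue of loading the CDT).
rows = [(int(r["id"]), r["val"]) for r in csv.DictReader(io.StringIO(csv_text))]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER, val TEXT)")

# One transaction for the entire write: the analogue of a single
# Write to DS node handling all rows at once.
with conn:
    conn.executemany("INSERT INTO records (id, val) VALUES (?, ?)", rows)

print(conn.execute("SELECT COUNT(*) FROM records").fetchone()[0])  # 1000
```

The design point is that a single bulk write is one connection and one transaction, avoiding the many-connections concern raised with MNI entirely.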