Hi All,
I have nearly 45,000 rows of data with 11 columns in a CDT. I need to upload the data to the Appian DB without affecting performance. Kindly provide your suggestions for splitting the data into batches, say 2,000 rows at a time, and then uploading each batch to the DB.
Thanks in Advance.
Pradeep.B
This largely depends on where your 45,000 rows of data are currently located. Are they in some running process already? Or in an Excel file? Or something else?
First, the data will be provided in a Notepad (plain text) file; then I will convert and map the data into the CDT using expression rules.
From Notepad, would you just be pasting it into Appian via an interface, or uploading the text file, etc.? I suppose either could work, but each would require a slightly different approach up front.
Either way, presuming that you end up with all 45,000 rows within a PV, preferably parsed into a CDT array, you can create a looping process that takes, for example, 1,000 rows at a time, stores them in a separate PV (overwriting the previous contents of that separate PV), writes that to the DB, then moves on to the next set. I've done this before, and assuming it's not something that needs to be done routinely or constantly, it should work well.
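In plain code terms, the batching loop above looks roughly like this. This is a Python sketch of the idea, not actual Appian SAIL; `write_batch_to_db` is a hypothetical stand-in for a Write to Data Store Entity smart service call, and the batch size of 1,000 is just the example figure from above.

```python
def chunk(rows, batch_size):
    """Yield successive batch_size-sized slices of rows."""
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

def upload_in_batches(rows, batch_size, write_batch_to_db):
    """Write rows to the DB one batch at a time; return the batch count."""
    batches_written = 0
    for batch in chunk(rows, batch_size):
        write_batch_to_db(batch)  # one DB write per batch, not per row
        batches_written += 1
    return batches_written

# Example: 45,000 rows in batches of 1,000 -> 45 separate writes
rows = [{"id": i} for i in range(45000)]
written_batches = []
count = upload_in_batches(rows, 1000, written_batches.append)
print(count)  # 45
```

The key point is that each iteration only holds one batch in the "working" variable, mirroring the separate PV that gets overwritten on each pass of the process loop.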
Note that if this process is something that will need to run more often, you will need to consider further safety checks (such as handling empty data, making sure the loop can't execute infinitely, etc.).