Hello All,
I have a requirement to import lakhs (hundreds of thousands) of rows into the Appian database using smart services, as we need to provide file upload functionality to the user.
Can anyone suggest the best way to achieve this with good performance?
Thanks in advance.
I would recommend batching the data here. You can create batches of 1,000-10,000 rows each and insert them through an MNI (Multiple Node Instances) loop or a looping process.
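As a rough sketch of what would drive the MNI count (pv!allRows and pv!batchSize are assumed names for a process variable holding the parsed rows and the chosen batch size, not anything Appian provides out of the box), the node's instances expression could be:

  /* Number of node instances needed to cover every row in batches of pv!batchSize */
  ceiling(length(pv!allRows) / pv!batchSize)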
How many columns will there be in the file?
You can try the Import Excel to Database smart service, but check its limitations first; for example, it is only feasible if the Excel column count is less than 50. Alternatively, you can read the Excel file, map its data into a CDT, split the array into batches using todatasubset(), and then pass each subset to a new process model that handles the data insertion. In that case, use the Start Process smart service: it is asynchronous and runs on a separate engine, which helps with load distribution.
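For illustration, a minimal sketch of the batching expression that could feed the Start Process node's input (pv!rows and pv!batchNumber are assumed variable names for the parsed data and the current iteration, not part of any standard object):

  /* Pull one batch of 1,000 rows; pass the .data field of the
     resulting datasubset to the insertion process model */
  todatasubset(
    pv!rows,
    a!pagingInfo(
      startIndex: (pv!batchNumber - 1) * 1000 + 1,
      batchSize: 1000
    )
  )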