Hi,
I am receiving a huge amount of data and need to write it to a database, and the write is taking a long time.
I have used:
a!writeToDataStoreEntity(dataStoreEntity: cons!APM_DSE_VALUES, valueToStore: local!data)
local!data is populated by calling a rule.
Can someone help me convert this to a forEach loop, or to write in batches?
Sample code would be appreciated.
Are you able to quantify what "a huge amount of data" is? The problem with batching up the writes is that an individual iteration may fail, and that might be a problem if you need the overall persistence of your dataset to be transactional (i.e. all or nothing). I'd try and address the performance issue in a single write first before looking for alternate design options. Are you able to determine where the time is being spent in the write? Is it in the database? Is it in the network? Is it in Appian itself?
Thanks, Stewart Burchell, for replying.
By a huge amount I mean 50-60 lakh (5-6 million) rows.
It takes around a minute to execute.
The Health Check report flags an issue with the data store.
So the only solution I can think of is to write in batches or in a loop.
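For what it's worth, a batched write could be sketched roughly as below. This is untested and makes several assumptions: local!batchSize is a new variable you would define, local!data is a flat list, and the expression runs somewhere a!writeToDataStoreEntity is actually permitted (e.g. a saveInto, or one Write to Data Store Entity node per batch in a process model). Also note the trade-off Stewart raised: splitting one write into many batches loses transactionality, so if a later batch fails, earlier batches are already committed.

```
a!forEach(
  /* One iteration per batch; enumerate() is 0-based */
  items: enumerate(ceiling(length(local!data) / local!batchSize)),
  expression: a!writeToDataStoreEntity(
    dataStoreEntity: cons!APM_DSE_VALUES,
    /* todatasubset() extracts one slice of local!data per iteration */
    valueToStore: todatasubset(
      local!data,
      a!pagingInfo(
        startIndex: fv!item * local!batchSize + 1,
        batchSize: local!batchSize
      )
    ).data
  )
)
```

For volumes in the millions of rows, it is usually better to keep this kind of batching in a process model loop (or to load the data into the database directly, outside Appian) rather than in an interface expression.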
It seems like you do this in an interface. Storing such volumes of data via an interface is a bad idea and will not scale. I suggest trying to find a different solution.
Yes, it was configured previously by another developer, and this issue was not addressed at that time; that is why I am looking for a different approach to resolve it.
If you have any other suggestions, do let me know. Thanks.
To be able to propose any alternatives, we need a detailed description of the scenario.