Hi,
I am getting a huge amount of data and need to write it to the database. It's taking a lot of time.
I have used
a!writeToDataStoreEntity(dataStoreEntity: cons!APM_DSE_VALUES, valueToStore: local!data)
local!data is populated by calling a rule.
Can someone help me convert this to a forEach loop, or write the data in batches?
Please help with the code.
Are you able to quantify what "a huge amount of data" is? The problem with batching up the writes is that an individual iteration may fail, and that might be a problem if you need the overall persistence of your dataset to be transactional (i.e. all or nothing). I'd try and address the performance issue in a single write first before looking for alternate design options. Are you able to determine where the time is being spent in the write? Is it in the database? Is it in the network? Is it in Appian itself?
Thanks Stewart Burchell for replying.
By huge amount I mean 50-60 lakh (5-6 million) rows.
It's taking around a minute to execute.
The Health Check report is flagging an issue with the data store.
So the solution I can think of is to write in batches or by looping, roughly like the sketch below.
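Something like this is what I was imagining, assuming the write happens in a saveInto and that I split local!data into chunks of local!batchSize (the batch size and the rule behind local!data are just placeholders from my app):

a!forEach(
  /* one write per batch of local!batchSize rows */
  items: enumerate(ceiling(length(local!data) / local!batchSize)),
  expression: a!writeToDataStoreEntity(
    dataStoreEntity: cons!APM_DSE_VALUES,
    valueToStore: todatasubset(
      local!data,
      a!pagingInfo(
        startIndex: fv!item * local!batchSize + 1,
        batchSize: local!batchSize
      )
    ).data
  )
)

Would that be a reasonable direction, or is there a better way?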
It seems like you do this in an interface. Storing such volumes of data in an interface is a bad idea and will not scale. I suggest trying to find a different solution.
Yes, it was configured previously by another developer and this issue was not addressed at that time, which is why I am looking for a different approach to resolve it.
If you have any other suggestions, do let me know. Thanks.
To be able to propose any alternatives, we need a detailed description of the scenario.
That volume does seem extreme. Next question: does it matter that it's taking 1 minute to execute? That is: is it material in any way? Is there a user who is waiting at the front end for the write to complete? Note that just because Health Check is calling it out doesn't make it inherently bad; Health Check calls out risks, not issues, and it's up to you to determine whether 1 minute is really a problem. It would be a problem if, as described, someone was waiting for the write to complete, but even then you could redesign to manage the user's expectations, e.g. a message that says "We're writing the data, it may take a few minutes, we'll let you know when it's complete". And if it's OK running as a background task, does this happen infrequently, or could there be a lot of concurrent writes of the same volume running, which would also present a problem?
I am using a Web API to pull data from a third party, and after pulling that data we use writeToDataStoreEntity to dump it all into a table. Since the data we are writing is so large, the write takes time, which is having an impact.
After the Health Check report we identified the risk. Now we are looking for a solution so that writing that data to the database takes less time.
Yes Stewart Burchell, it's slowing the application down, which is why we are looking for a way to minimize the time.
Does that API support paging? If so, just create a synced record using that API as the source.
docs.appian.com/.../Service-Backed_Record_Tutorial.html
I echo Stewart and Stefan. Inserting so much data in real time will be slow.
If you want to keep your current approach, it would be best to switch to an asynchronous write via startProcess, something like the sketch below.
Otherwise, it would be interesting to know why you are calling the API and writing so much data to a table, and why you can't use a synced record as Stefan has suggested.
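Roughly like this in the Web API, returning immediately and letting a process model do the actual write (just a sketch; cons!APM_PM_WRITE_VALUES and the "data" parameter are placeholder names for your process model constant and its process variable):

a!startProcess(
  processModel: cons!APM_PM_WRITE_VALUES, /* placeholder: constant pointing at a process model that writes the data */
  processParameters: {
    data: local!data /* placeholder: process variable consumed by a Write to Data Store Entity node */
  },
  onSuccess: a!httpResponse(
    statusCode: 202,
    body: a!toJson({ message: "Write started; it will complete in the background" })
  ),
  onError: a!httpResponse(
    statusCode: 500,
    body: a!toJson({ message: "Could not start the write process" })
  )
)

That way the API responds quickly and the heavy write runs in the background.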
Thanks Mathieu Drouin for responding. I understand that inserting so much data will cause issues, but sometimes it's a requirement you have to follow if the business is rigid about it.
Yeah, well there are some limitations that you need to take into account. Inserting a large number of rows synchronously will be slow. There's not much you can do about it aside from running it asynchronously in the background while the user is doing other things.