For example, my array size is 3400.
On the first write, insert records 1 to 1000; on the 2nd write, 1001 to 2000; on the 3rd write, 2001 to 3000; and on the 4th write, 3001 to 3400.
How can we achieve this? Can anyone help me out?
TIA
You can use looping and paging to get 1000 records at a time. But 3400 records in a variable is a bit concerning; can I know the size of the data? Appian isn't meant for large data processing, so you might get timeout exceptions and performance issues (it occupies more space). Do the bulk inserts and updates through Data Store entities.
Hi sindhup7044
1) If the data to write can be fetched with a query, you can use looping logic in the process model. Have process variables to hold startIndex and totalCount. Query 1000 (I would suggest 500) records, write them to the database, then increment startIndex. In an XOR node, check whether startIndex > totalCount: if so, end the process; otherwise continue the loop (see the sketch after this list).
There are different cases based on the filters you apply. Say you query with the filter status = A and then update those rows to status = B: the updated rows drop out of the filtered result set, so you would need to modify the loop logic (for example, keep startIndex at 1 instead of incrementing it). The approach above is the more generic one.
2) If you already have the list from a process or a form, use startIndex and totalCount variables in a loop, as above.
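A minimal sketch of the query node's expression inside that loop, assuming process variables pv!startIndex and pv!batchSize and a hypothetical entity constant cons!MY_ENTITY (the names are placeholders, not from the original post):

a!queryEntity(
  entity: cons!MY_ENTITY,
  query: a!query(
    /* startIndex advances by batchSize on every pass through the loop */
    pagingInfo: a!pagingInfo(
      startIndex: pv!startIndex,
      batchSize: pv!batchSize
    )
  ),
  /* needed so .totalCount is populated for the XOR check */
  fetchTotalCount: true
)

The node would write the result's .data to the database, store .totalCount into pv!totalCount on the first pass, and increment pv!startIndex by pv!batchSize before the XOR node tests pv!startIndex > pv!totalCount.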
Hi sindhup7044, may I know how your data is arranged in your system? More detail on where you are getting those 3400 records from, where you are trying to store them, and how, would give an idea of the right approach to pick.
I would suggest going with a stored procedure that has the logic to update or insert entries in the table.
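If you go that route, here is a minimal sketch of invoking it from an expression, assuming Appian 23.2+ (where a!executeStoredProcedureOnSave() is available), a hypothetical procedure named usp_bulk_upsert, and a placeholder data source constant cons!MY_DATA_SOURCE:

a!executeStoredProcedureOnSave(
  dataSource: cons!MY_DATA_SOURCE,    /* placeholder data source constant */
  procedureName: "usp_bulk_upsert",   /* hypothetical procedure doing the set-based insert/update */
  inputs: {
    a!storedProcedureInput(name: "run_id", value: local!runId)
  },
  onSuccess: a!save(local!status, "done"),
  onError: a!save(local!status, "failed")
)

The advantage is that the database does the 3400-row insert/update in one set-based operation instead of Appian looping over batches.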
Technically, the Write Records Smart Service supports up to 50,000 records at a time:
https://docs.appian.com/suite/help/24.3/Write_Records_Smart_Service.html#setup-tab
As others mentioned, it really depends on the amount of data in each of those 3400 records, which could ultimately impact performance. You also need to take data retention into consideration: if you load a lot of data into PVs, you'll probably want to avoid archiving that process to save disk space.
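For completeness, a minimal sketch of the expression-side counterpart, a!writeRecords(), assuming a hypothetical record type and that local!orders already holds the (up to 50,000) record instances to persist; like a!writeToDataStoreEntity, it has to run from a saveInto or a Web API:

a!writeRecords(
  records: local!orders,   /* list of record instances of your record type */
  onSuccess: a!save(local!message, "Records written"),
  onError: a!save(local!message, "Write failed")
)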
1. Prepare the array: create an array with the 3400 records.
2. Define the batch size: set a batch size (e.g., 1000).
3. Use a loop to write each batch to the Data Store Entity:
a!localVariables(
  local!batchSize: 1000,
  /* assumes pv!recordsArray holds the full 3400-item array */
  local!recordsArray: pv!recordsArray,
  a!forEach(
    /* enumerate(n) + 1 gives batch numbers 1..n; here n = ceiling(3400 / 1000) = 4 */
    items: enumerate(ceiling(length(local!recordsArray) / local!batchSize)) + 1,
    expression: a!writeToDataStoreEntity(
      dataStoreEntity: cons!YOUR_DATA_STORE_ENTITY,
      /* todatasubset() pages the array; the last batch (3001-3400) comes back short automatically */
      valueToStore: todatasubset(
        local!recordsArray,
        a!pagingInfo(
          startIndex: (fv!item - 1) * local!batchSize + 1,
          batchSize: local!batchSize
        )
      ).data
    )
  )
)

Note that a!writeToDataStoreEntity is a smart service function, so evaluate this from a saveInto (or a Web API) rather than a plain expression rule.
Thanks Shanmathi, it worked!
Good to know!!