Hi,
I have a table with 15 columns (primary key Id, auto-increment), and my process inserts 11,701 rows. The Write to Data Store node takes 6 minutes to insert them.
Is that acceptable, or what can I do to improve it?
Please find the process node details attached.
I would consider writing 11,701 rows from a process to be a design issue. What is the use case?
6 minutes seems to be OK.
Hi Stefan,
There is a data source with a large amount of data for projection; after applying certain filters we get 11,701 rows, though sometimes it can be fewer or more.
We need to show the data as a preview in a read-only grid, where the user can edit rows after selecting them. Finally the user submits, which triggers a process that executes a Write to Data Store Entity node.
I would try to implement that using a staging table to temporarily hold the data. After preview/edit, copy it over with a stored procedure.
In general, Appian processes are not the best option for this kind of task.
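A minimal sketch of that staging-table pattern, using Python and SQLite purely for illustration (the table and column names are made up; in Appian the bulk load would be a Write to Data Store Entity pointed at the staging table, and the copy step would be a stored procedure invoked from the process):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Target table that downstream consumers query.
cur.execute(
    "CREATE TABLE orders "
    "(id INTEGER PRIMARY KEY AUTOINCREMENT, item TEXT, qty INTEGER)"
)
# Staging table: same shape plus a session key,
# so concurrent users' previews don't collide.
cur.execute(
    "CREATE TABLE orders_staging (session_id TEXT, item TEXT, qty INTEGER)"
)

# 1) Bulk-load the filtered rows into staging.
rows = [("sess-1", f"item-{i}", i % 5) for i in range(11701)]
cur.executemany(
    "INSERT INTO orders_staging (session_id, item, qty) VALUES (?, ?, ?)",
    rows,
)

# 2) On submit, copy staging -> target in one set-based statement
#    (this is what the stored procedure would do, far faster than
#    row-by-row inserts from the process engine).
cur.execute(
    "INSERT INTO orders (item, qty) "
    "SELECT item, qty FROM orders_staging WHERE session_id = ?",
    ("sess-1",),
)

# 3) Clean up this session's staging rows.
cur.execute("DELETE FROM orders_staging WHERE session_id = ?", ("sess-1",))
conn.commit()

count = cur.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # 11701
```

The point of the pattern is that the expensive part (moving ~12k rows) becomes a single set-based SQL statement inside the database, while the process only passes a session key to the procedure.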