If I have 1000 records that I want to add to the database in smaller chunks, how should I design a process to accomplish this?
I have a few potential solutions in mind, but I'm unsure about their performance implications, so I'd like clarification on the most suitable approach before implementing this functionality.
Is that a one-time job? If not, how often is that import triggered? Only once at a time, or can there be multiple at the same time?
In general, my best experience is: XLS->Temp DB table->Stored Procedure->Final table.
Try to avoid any looping inside Appian using expressions or processes.
That's what I do. The reason is that I can make the temp table of text fields only, and then do the data type transformation in the stored procedure.
If you need to run multiple jobs at the same time, add a separate column to store a unique identifier to identify all rows belonging to that job.
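The XLS -> temp table -> stored procedure -> final table flow above can be sketched as follows. This is a minimal Python simulation, not Appian or SQL: the lists stand in for the two database tables, the column names (id, name, email) come from the employee example later in this thread, and `transfer_job` plays the role the stored procedure would — casting text fields to real types and moving only the rows tagged with one job's identifier.

```python
def load_to_staging(staging, rows, job_id):
    """Load raw spreadsheet rows into the staging 'table' as text only,
    tagging each row with the unique job identifier."""
    for row_id, name, email in rows:
        staging.append({"job_id": job_id, "id": row_id, "name": name, "email": email})

def transfer_job(staging, final, job_id):
    """What the stored procedure would do: convert text fields to their
    real types, copy the rows for this job into the final table, and
    clear them from staging so concurrent jobs don't collide."""
    for row in staging:
        if row["job_id"] == job_id:
            final.append({"id": int(row["id"]), "name": row["name"], "email": row["email"]})
    staging[:] = [r for r in staging if r["job_id"] != job_id]
```

In a real database the job identifier would be an extra column on the staging table and `transfer_job` would be an `INSERT ... SELECT ... WHERE job_id = ?` inside the procedure.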
I didn't understand. For example, suppose I have 6000 records in my Excel spreadsheet. On the other side, I have an employee table with three columns — ID, name, and email — matching the columns in the spreadsheet.
Now, I want a process that stores the data from Excel to the database in chunks, with the smart service sending 1000 records at a time. How would the stored procedure work in this scenario?
Create a custom paging expression as below:
if(
  ri!index = 1,
  a!pagingInfo(startIndex: 0, batchSize: 100),
  a!forEach(
    items: enumerate(ri!index) + 1,
    expression: if(
      fv!index = 1,
      a!pagingInfo(startIndex: 0, batchSize: 100),
      a!pagingInfo(startIndex: 100 * (fv!item - 1), batchSize: 100)
    )
  )
)
Next, create a stored procedure that uses LIMIT. Pass the batch size and start index from the paging above to this procedure so that it fetches only a limited number of rows from the staging table per call; then run the actual stored procedure to push that data from the staging table to the main transaction table.
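The chunked transfer described above can be sketched as follows. This is an illustrative Python simulation, not the actual stored procedure: the lists stand in for the staging and transaction tables, and each loop iteration mimics one call to a LIMIT/OFFSET-style procedure with the start index and batch size supplied by the paging expression.

```python
def batch_bounds(total, batch_size):
    """Yield (offset, limit) pairs that cover `total` rows in chunks,
    mirroring the startIndex/batchSize values from the paging expression."""
    for start in range(0, total, batch_size):
        yield start, min(batch_size, total - start)

def transfer_in_chunks(staging, final, batch_size):
    """Mimic repeated stored-procedure calls: each call copies one chunk
    of the staging table into the final table, like
    SELECT ... FROM staging LIMIT <limit> OFFSET <offset>."""
    for offset, limit in batch_bounds(len(staging), batch_size):
        final.extend(staging[offset:offset + limit])
```

So for the 6000-record example, the process would invoke the procedure six times with a batch size of 1000, and each invocation moves exactly one chunk.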
Konduru Chaitanya,
Okay, let me try it first; if I need help, I will definitely reach out in the discussion. Thanks