If I have 1000 records that I want to add to the database in smaller chunks, how should I design a process to accomplish this?
I have a few potential solutions in mind, but I'm unsure about their performance implications. I need clarification on the most suitable approach in terms of performance before implementing this functionality.
Hello Ankush Sharma
Doing the following would not have much of an impact on performance:
1. Load the data from your Excel file into a staging/temporary table.
2. From this temp table, build paging in an expression rule to fetch 100 records at a time, and use a stored procedure to write them to the actual table.
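For illustration, the staging table for step 1 could look something like this. This is a minimal sketch assuming MariaDB/MySQL syntax; the table and column names are illustrative, not from the thread:

CREATE TABLE employee_staging (
  row_id INT AUTO_INCREMENT PRIMARY KEY,  -- stable sort key for paging
  emp_id VARCHAR(50),
  name   VARCHAR(255),
  email  VARCHAR(255)
);

The auto-increment key gives the paging a stable sort order, so each 100-record window is deterministic.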
Is that a one-time job? If not, how often is that import triggered? Only once at a time, or can there be multiple at the same time?
In general, my best experience is: XLS->Temp DB table->Stored Procedure->Final table.
Try to avoid any looping inside Appian using expressions or processes.
Hi, thanks for replying. Yes, I have a bulk of data that I don't want to send all at once; I want to send it in smaller batches. I previously thought this could be handled by looping, but looping will impact performance, and I think we should avoid looping in the process.
That's what I do. The reason is that I can make the temp table with text fields only, and then do the data type transformation in the stored procedure.
If you need to run multiple jobs at the same time, add a separate column to store a unique identifier to identify all rows belonging to that job.
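Building on the staging-table sketch above (still MariaDB-style syntax, still hypothetical names), the job column and the type-converting procedure could look roughly like this:

ALTER TABLE employee_staging ADD COLUMN job_id VARCHAR(36);  -- one unique value per import job

DELIMITER //
CREATE PROCEDURE import_employees(IN p_job_id VARCHAR(36))
BEGIN
  INSERT INTO employee (id, name, email)
  SELECT CAST(emp_id AS UNSIGNED), name, email  -- data type transformation happens here
  FROM employee_staging
  WHERE job_id = p_job_id;                      -- only rows belonging to this job
END //
DELIMITER ;

Keeping the staging columns as text means a bad row fails inside the procedure, where it can be handled, rather than breaking the Excel load itself.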
I didn't understand. For example, suppose I have 6000 records in my Excel spreadsheet. On the other side, I have an employee table with three columns (ID, name, and email), the same columns that exist in the Excel spreadsheet.
Now, I want to perform an action where a process will store data in chunks from Excel to the database. The smart service will send 1000 data entries at a time. How will the stored procedure work in this scenario?
Create custom paging as below, where ri!index is the batch number (1, 2, 3, ...):
a!pagingInfo(
  /* batch 1 starts at 0, batch 2 at 100, and so on */
  startIndex: 100 * (ri!index - 1),
  batchSize: 100
)
Next, create a stored procedure using LIMIT. Pass the above start index and batch size to this procedure so that it fetches only that window of rows from the staging table; then run the actual stored procedure to push the data from the staging table to the main transaction table.
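A rough sketch of such a procedure, here collapsed into a single step for brevity (MariaDB/MySQL syntax with the illustrative names from above; note that OFFSET is 0-based, which matches the startIndex values the expression produces):

DELIMITER //
CREATE PROCEDURE copy_employee_batch(IN p_start_index INT, IN p_batch_size INT)
BEGIN
  INSERT INTO employee (id, name, email)
  SELECT CAST(emp_id AS UNSIGNED), name, email
  FROM employee_staging
  ORDER BY row_id                            -- deterministic order for windowing
  LIMIT p_batch_size OFFSET p_start_index;
END //
DELIMITER ;

Calling it with p_start_index = 0, 100, 200, ... and p_batch_size = 100 moves one window at a time from the staging table to the transaction table.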
Konduru Chaitanya,
Okay, let me try it first; if I need help, I will definitely reach out on this discussion. Thanks.
Konduru Chaitanya, here is what I have done so far: I created an expression rule that fetches data from the Excel sheet, and whenever I change the index, it advances the paging window by 1000 records (my batch size), since I have 6000 records in the Excel sheet.
a!localVariables(
  /* batch number: 1 reads rows 1-1000, 2 reads rows 1001-2000, and so on */
  local!batchNumber: 1,
  local!batchSize: 1000,
  readexcelsheetpaging(
    excelDocument: cons!APO_DOCUMENT,
    sheetNumber: 0,
    pagingInfo: a!pagingInfo(
      startIndex: (local!batchNumber - 1) * local!batchSize + 1,
      batchSize: local!batchSize
    )
  )
)