Want to add records to the database in smaller chunks; need to design a process

Certified Senior Developer

If I have 1000 records that I want to add to the database in smaller chunks, how should I design a process to accomplish this?

I have a few potential solutions, but I'm unsure about their performance implications:

  1. Implementing a loop: In each iteration, I would send data in manageable chunks.
  2. Utilizing an XLSX file: I could create an XLSX file and then use the Excel to Database smart service to transfer the data.
  3. Using a stored procedure.

However, I need clarification on the most suitable approach in terms of performance before implementing this functionality.

  • 0
    Certified Lead Developer
    in reply to Ankush Sharma

    That's what I do. The reason is that I can build the temp table with text fields only, and then do the data type transformation in the stored procedure.

    If you need to run multiple jobs at the same time, add a separate column holding a unique identifier that marks all rows belonging to one job.
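
    For illustration, a sketch of what that could look like in MySQL (the table, column, and procedure names are placeholders I made up, not anything from this thread):

    -- Staging table: payload columns are plain text so the initial load
    -- never fails on type conversion; job_id ties rows to one load job
    CREATE TABLE stg_employee (
      stg_id   INT AUTO_INCREMENT PRIMARY KEY,
      job_id   VARCHAR(64) NOT NULL,
      emp_id   VARCHAR(50),
      emp_name VARCHAR(255),
      email    VARCHAR(255)
    );

    -- The stored procedure does the data type transformation while
    -- moving one job's rows into the main table
    DELIMITER //
    CREATE PROCEDURE move_job_to_employee(IN p_job_id VARCHAR(64))
    BEGIN
      INSERT INTO employee (id, name, email)
      SELECT CAST(emp_id AS UNSIGNED), emp_name, email
      FROM stg_employee
      WHERE job_id = p_job_id;

      DELETE FROM stg_employee
      WHERE job_id = p_job_id;
    END //
    DELIMITER ;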

  • 0
    Certified Senior Developer
    in reply to Stefan Helzle

    I didn't understand. For example, suppose I have 6000 records in my Excel spreadsheet. On the database side, I have an employee table with three columns, ID, name, and email, which match the columns in the spreadsheet.

    Now, I want to build a process that moves the data from Excel to the database in chunks, with the smart service sending 1000 records at a time. How would the stored procedure work in this scenario?



  • 0
    Certified Senior Developer
    in reply to Ankush Sharma

    Create a custom paging rule like the one below:

    /* ri!index is assumed to be the current batch number from the process loop. */
    /* Appian paging is 1-based, so batch n starts at row 100 * (n - 1) + 1.     */
    if(
      ri!index = 1,
      a!pagingInfo(
        startIndex: 1,
        batchSize: 100
      ),
      a!forEach(
        items: enumerate(ri!index) + 1,
        expression: a!pagingInfo(
          startIndex: 100 * (fv!item - 1) + 1,
          batchSize: 100
        )
      )
    )

    Next, create a stored procedure using LIMIT and pass it the start index and batch size from the paging above, so that it fetches only that number of rows from the staging table. Then run the actual stored procedure to push the data from the staging table to the main transaction table.
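
    As a rough sketch (MySQL syntax, reusing the placeholder names from the staging-table example above, and collapsing the fetch and push steps into one procedure for brevity):

    DELIMITER //
    CREATE PROCEDURE move_batch(IN p_start_index INT, IN p_batch_size INT)
    BEGIN
      DECLARE v_offset INT;
      -- Appian's startIndex is 1-based, while the LIMIT offset is 0-based
      SET v_offset = p_start_index - 1;

      INSERT INTO employee (id, name, email)
      SELECT CAST(emp_id AS UNSIGNED), emp_name, email
      FROM stg_employee
      ORDER BY stg_id             -- deterministic order so batches don't overlap
      LIMIT v_offset, p_batch_size;  -- skip v_offset rows, take p_batch_size
    END //
    DELIMITER ;

    The process model would then call this once per loop iteration, feeding it the startIndex and batchSize produced by the paging rule.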

  • 0
    Certified Senior Developer
    in reply to Konduru Chaitanya

    Okay, let me try it first; if I need help then I will definitely reach out to the discussion. Thanks 🙂

  • 0
    Certified Senior Developer
    in reply to Konduru Chaitanya

    So right now what I did is create an expression rule that fetches data from Excel; whenever I change the index, it advances the start row by 1000, because I have 6000 records in the Excel sheet:

    a!localVariables(
      /* which 1000-row chunk to read; change the index to move to the next chunk */
      local!index: 1,
      local!batchSize: 1000,
      readexcelsheetpaging(
        excelDocument: cons!APO_DOCUMENT,
        sheetNumber: 0,
        pagingInfo: a!pagingInfo(
          /* chunk n starts at row (n - 1) * batchSize + 1, i.e. 1, 1001, 2001, ... */
          startIndex: (local!index - 1) * local!batchSize + 1,
          batchSize: local!batchSize
        )
      )
    )



    Now I am going to create a stored procedure to execute this. Hopefully I am on the right path.