Want to add records to the database in smaller chunks, need to design a process

Certified Senior Developer

If I have 1000 records that I want to add to the database in smaller chunks, how should I design a process to accomplish this?

I have a few potential solutions, but I'm unsure about their performance implications:

  1. Implementing a loop: in each iteration, I would send the data in manageable chunks (a rough sketch of this option is shown below).
  2. Utilizing an XLSX file: I could create an XLSX file and then use the Excel to Database smart service to transfer the data.
  3. Using a stored procedure.

However, I need clarification on the most suitable approach in terms of performance before implementing this functionality.
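
To make option 1 concrete, this is roughly what I have in mind: an expression that splits the full list into fixed-size batches, and a looping process model that writes one batch at a time. The rule name, the source list, and the batch size of 200 below are placeholders, not final values:

  a!localVariables(
    local!batchSize: 200,
    /* placeholder rule: the full list of records to insert */
    local!allRecords: rule!APO_getRecordsToInsert(),
    local!batchCount: ceiling(length(local!allRecords) / local!batchSize),
    a!forEach(
      items: enumerate(local!batchCount) + 1,
      /* wrap each batch in a map so the batches stay separate (Appian flattens nested lists) */
      expression: a!map(
        batch: index(
          local!allRecords,
          /* indices (n-1)*batchSize+1 through n*batchSize for batch n, capped at the list length */
          enumerate(
            min(
              local!batchSize,
              length(local!allRecords) - (fv!item - 1) * local!batchSize
            )
          ) + (fv!item - 1) * local!batchSize + 1,
          {}
        )
      )
    )
  )

My thinking is that each batch in the result would then be passed to the Write to Data Store Entity smart service inside a loop in the process model.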


    Certified Senior Developer
    in reply to Konduru Chaitanya


    So, right now, what I did is create an expression rule that fetches data from the Excel sheet. Whenever I change the index, the read moves forward by the batch size of 1000, because I have 6000 records in the Excel sheet.

    a!localVariables(
      /* change local!startIndex to 1, 2, 3, ... to pick which chunk of 1000 rows to read */
      local!startIndex: 1,
      local!batchSize: 1000,
      readexcelsheetpaging(
        excelDocument: cons!APO_DOCUMENT,
        sheetNumber: 0,
        /* chunk 1 reads rows 1-1000, chunk 2 reads rows 1001-2000, and so on */
        pagingInfo: a!pagingInfo(
          startIndex: (local!startIndex - 1) * local!batchSize + 1,
          batchSize: local!batchSize
        )
      )
    )
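
    With this expression, setting local!startIndex to 1 through 6 returns rows 1-1000, 1001-2000, and so on up to 5001-6000, so the 6000 rows in the sheet can be read in six chunks.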



    Now I am going to create a stored procedure to execute this. Hopefully I am going down the right path.
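
    Roughly, my plan is to call it from a saveInto once it exists, something like the sketch below. The data source constant, procedure name, input parameter, and local!currentChunk are all placeholders, and I am assuming the chunk gets serialized to JSON text for the procedure to parse:

    a!executeStoredProcedureOnSave(
      /* placeholder data source constant */
      dataSource: cons!APO_DATA_SOURCE,
      /* placeholder procedure name */
      procedureName: "APO_INSERT_RECORDS",
      inputs: {
        /* placeholder parameter; the current chunk is passed as JSON text */
        a!storedProcedureInput(name: "recordData", value: a!toJson(local!currentChunk))
      },
      /* saves to run after the procedure succeeds or fails would go here */
      onSuccess: {},
      onError: {}
    )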
