Hello Everyone,

We have a scenario where a user uploads an Excel file containing data; the system parses the data and writes the details to the database.
Each record in the Excel file corresponds to a record in the database. We are using the Read Excel plugin to parse the file and Appian's Write to Data Store Entity to write to the database.
There is no upper limit on the number of records in the Excel file; a user can upload 40,000 or even more records. Writing such a large data set to the database in one operation is causing a server outage.
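One common mitigation for this kind of outage is to write in fixed-size batches rather than pushing all 40,000+ rows in a single operation, committing after each batch so no one transaction holds the entire data set. Below is a minimal sketch of the idea in Python against an in-memory SQLite database; the table name `upload_record`, its columns, and the batch size are all hypothetical placeholders, not the actual Appian data store schema.

```python
import sqlite3

BATCH_SIZE = 1000  # assumed tuning value; adjust to what the server handles comfortably

def write_in_batches(conn, rows, batch_size=BATCH_SIZE):
    """Insert rows in fixed-size batches, committing after each batch
    so no single transaction accumulates tens of thousands of pending writes."""
    cur = conn.cursor()
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        cur.executemany(
            "INSERT INTO upload_record (case_id, detail) VALUES (?, ?)",
            batch,
        )
        conn.commit()  # release locks between batches

# Example: 40,000 parsed rows for a hypothetical case 101
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE upload_record (case_id INTEGER, detail TEXT)")
rows = [(101, f"row {i}") for i in range(40000)]
write_in_batches(conn, rows)
```

In an Appian process model the same shape is typically achieved by looping over chunks of the parsed CDT array and calling Write to Data Store Entity once per chunk, rather than once for the whole file.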

Also, after the data is written to the database, an approval task is generated. In this task, the approver can upload a new version of the Excel file.
If a new version is uploaded, all the records written to the database earlier must be deleted and new records written for the case.

We would like to know which design approaches we can consider for writing and deleting large numbers of records in the database.

OriginalPostID-189203


  • @narasimhaadityac The Excel file will not have primary keys; they will be generated as part of the write operation. When the user uploads a new file, all the existing records need to be deleted from the database and the new records written. We do not need to check whether any records were modified; the requirement is to delete all old data and write new records from the new file.
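Since the old rows are discarded wholesale rather than reconciled, the replacement can be a single set-based DELETE scoped to the case, followed by batched inserts of the new file's rows. A small sketch of that pattern, again using a hypothetical `upload_record` table with an assumed `case_id` column linking rows to the case:

```python
import sqlite3

def replace_case_records(conn, case_id, new_rows, batch_size=1000):
    """Delete every existing record for the case with one set-based DELETE
    (not row-by-row), then insert the rows parsed from the new Excel
    version in batches."""
    cur = conn.cursor()
    cur.execute("DELETE FROM upload_record WHERE case_id = ?", (case_id,))
    for start in range(0, len(new_rows), batch_size):
        cur.executemany(
            "INSERT INTO upload_record (case_id, detail) VALUES (?, ?)",
            new_rows[start:start + batch_size],
        )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE upload_record (case_id INTEGER, detail TEXT)")
# Rows from the original upload for case 101
conn.executemany("INSERT INTO upload_record VALUES (?, ?)",
                 [(101, f"old {i}") for i in range(5)])
# Approver uploads a new version: old rows go, new rows replace them
replace_case_records(conn, 101, [(101, f"new {i}") for i in range(3)])
```

In Appian terms this usually maps to a stored procedure or a Delete from Data Store Entity step keyed on the case, followed by the same chunked write loop used for the initial upload, so the database never has to process 40,000 individual delete statements.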