Best Practice to write data from Excel to Database

Hello All,


I have a scenario where I am trying to write 100,000+ rows of data from Excel to my database.

Can anyone suggest the best practice for doing this?


Thanks.


  • Is this a one-time batch import or a regular occurrence?

    • If it is a one-time batch import, your DBA would probably be able to help you use the DB's specific tools (e.g. SSMS for MSSQL) and get it done painlessly for you.
    • If it's a regular occurrence, I would ideally use specialised ETL tools and orchestrate the workflow with Appian if and when required.
    • If, for some reason, you want to do this batch import in Appian, you could export the Excel file as a CSV, split it into rows, and then write each row's elements to the data store, mapped to the requisite CDT (a rough sketch of the batching idea follows below). While this works in principle, I'm not certain it is something I would do with the amount of data you are talking about.
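    For the one-time route, the DB's own bulk loader (e.g. BULK INSERT or the bcp utility for MSSQL, reachable via SSMS) is the usual answer. To illustrate the CSV approach in the last bullet outside Appian, here is a minimal Python sketch of batched inserts; the file name, table, and columns are hypothetical placeholders, and sqlite3 stands in for whatever driver your database uses (e.g. pyodbc for MSSQL):

    ```python
    import csv
    import sqlite3
    from itertools import islice

    BATCH_SIZE = 1000  # tune to your DB; one commit per batch keeps transactions small

    def load_csv(csv_path: str, db_path: str) -> None:
        """Read rows exported from Excel as CSV and insert them in batches."""
        conn = sqlite3.connect(db_path)
        # Hypothetical target table; replace with your real schema / CDT mapping.
        conn.execute(
            "CREATE TABLE IF NOT EXISTS employee (id INTEGER, name TEXT, dept TEXT)"
        )
        with open(csv_path, newline="", encoding="utf-8") as f:
            reader = csv.reader(f)
            next(reader, None)  # skip the header row exported from Excel
            while True:
                batch = list(islice(reader, BATCH_SIZE))
                if not batch:
                    break
                conn.executemany(
                    "INSERT INTO employee (id, name, dept) VALUES (?, ?, ?)", batch
                )
                conn.commit()  # a failure loses at most the current batch
        conn.close()

    if __name__ == "__main__":
        load_csv("export.csv", "target.db")
    ```

    Batching the inserts (here 1,000 rows per commit) is what makes a 100,000-row load tractable; writing one row at a time at that volume is the part I would be wary of.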