Data upload by file

Hello

Let me briefly describe the client's requirement and how we have approached the development.

We need to read files, in some cases daily, to update business data in our Appian tables. The client drops different CSV files in a directory via SFTP; we access those folders to retrieve the CSVs and loop through them with scheduled processes. Our Appian tables are then updated through constructors (expression rules) that iterate over each CSV, check whether the records already exist in our tables, and build an array to perform the insert or update.
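
To make the pattern concrete, here is a rough, simplified sketch of that logic in plain Python over SQLite (not our actual Appian rule; the table and column names such as business_data and record_id are just illustrative):

```python
import csv, io, sqlite3

# Stand-in for the CSV dropped on the SFTP folder.
csv_text = "record_id,value\nA1,new-a1\nA2,new-a2\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE business_data (record_id TEXT PRIMARY KEY, value TEXT)")
conn.execute("INSERT INTO business_data VALUES ('A1', 'old-a1')")  # pre-existing record

to_insert, to_update = [], []
for row in csv.DictReader(io.StringIO(csv_text)):
    # One existence lookup per CSV line: fine for small files, but ~155,000
    # of these round trips is where the process bogs down.
    exists = conn.execute("SELECT 1 FROM business_data WHERE record_id = ?",
                          (row["record_id"],)).fetchone()
    if exists:
        to_update.append((row["value"], row["record_id"]))
    else:
        to_insert.append((row["record_id"], row["value"]))

conn.executemany("INSERT INTO business_data (record_id, value) VALUES (?, ?)", to_insert)
conn.executemany("UPDATE business_data SET value = ? WHERE record_id = ?", to_update)
conn.commit()
```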

From a development point of view everything works fine, but we run into problems when the data volume is high: one of the processes handles a file of 155,000 lines, each validated for existence against the Appian tables. The process hangs without giving an error.

We would like to ask whether this is the right way to approach this development. Can we improve our processes?

Thank you


  • Hey Alex,

    I would suggest having a process where you read the file and put it into a staging table before writing into your main tables. This way, you don't lose any data and your process model won't get overloaded.
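
    Something along these lines, as a rough Python/SQLite sketch (the staging table name, its columns, and the PENDING status value are just assumptions for illustration, not Appian specifics):

    ```python
    import csv, io, sqlite3

    # Hypothetical staging table: raw CSV rows land here untouched, with a status flag.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE staging_business_data (
                      record_id TEXT, value TEXT, status TEXT DEFAULT 'PENDING')""")

    csv_text = "record_id,value\nA1,new-a1\nA2,new-a2\n"   # stands in for the SFTP file

    # Bulk insert: no per-row validation at this point, so even a 155,000-line
    # file is a single straight write into staging.
    rows = [(r["record_id"], r["value"]) for r in csv.DictReader(io.StringIO(csv_text))]
    conn.executemany(
        "INSERT INTO staging_business_data (record_id, value) VALUES (?, ?)", rows)
    conn.commit()
    ```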

    It also depends on what you are importing. If you are importing a CSV file, you can use the CSV to SQL plugin, though I am not sure if it is still available.

    After that you can have a time-triggered process that pulls those rows from the database and writes them into your main tables. You can add a status column to your staging table so you can keep track of which data has already been processed.

    Additionally, I would suggest setting a threshold on the number of rows you process per run; I believe the limit is 1,000 rows, if I am not mistaken.
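
    Combining the status column and the row threshold, a time-triggered pass could look roughly like this (again a Python/SQLite sketch with made-up table names, not Appian-specific code):

    ```python
    import sqlite3

    BATCH_SIZE = 1000  # cap per run, as suggested above

    def process_pending(conn):
        """One timer-triggered pass: upsert up to BATCH_SIZE PENDING staging rows."""
        batch = conn.execute(
            "SELECT rowid, record_id, value FROM staging_business_data "
            "WHERE status = 'PENDING' LIMIT ?", (BATCH_SIZE,)).fetchall()
        for rowid, record_id, value in batch:
            # The upsert replaces the per-row existence check entirely.
            conn.execute(
                "INSERT INTO business_data (record_id, value) VALUES (?, ?) "
                "ON CONFLICT(record_id) DO UPDATE SET value = excluded.value",
                (record_id, value))
            # The status column records what has already been processed.
            conn.execute("UPDATE staging_business_data SET status = 'PROCESSED' "
                         "WHERE rowid = ?", (rowid,))
        conn.commit()
        return len(batch)

    # Demo setup using the same illustrative tables as above.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE business_data (record_id TEXT PRIMARY KEY, value TEXT)")
    conn.execute("CREATE TABLE staging_business_data "
                 "(record_id TEXT, value TEXT, status TEXT DEFAULT 'PENDING')")
    conn.execute("INSERT INTO staging_business_data (record_id, value) VALUES ('A1', 'new-a1')")
    while process_pending(conn):   # repeat until nothing is left PENDING
        pass
    ```

    Each run only touches a bounded number of rows, so a 155,000-line file simply gets worked off over several timer runs instead of one long-running process.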

    Hope this helps.

    Lejanson

  • in reply to legotx

    Thx Lejanson.

    I think it's a good idea.