Need to handle 35-40 MB of Excel data with 100+ columns


I have a requirement where I have to process, filter, and validate 35-40 MB of Excel data with 100+ columns, and, if validation is successful, import the data into a landing table.

What would be the best way to design this? We have no control over the Excel export, so we cannot split it into multiple smaller files and upload them separately; the file has to be handled in its entirety.

We are already handling 2K rows with process-and-validate logic, but the customer wants Appian to filter first and then process. After filtering, we need to validate the resulting data and inform the user of the outcome via a popup or message in the UI.


  • Hi,

    It seems likely that this 35-40 MB Excel file with 100+ columns is an export taken from another system. If so, it is worth exploring whether Appian can integrate directly with that system, so that it can periodically 'push' data to Appian (for example, by invoking an Appian Web API), or Appian can 'pull' data from it, assuming the other system makes an API available. A direct integration between Appian and the other system would give programmatic control over aspects such as batch size and sync frequency, and would therefore be preferred over shuttling large Excel files.
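
    For the 'push' option, a minimal sketch of what the Appian Web API expression could look like is below. The constant cons!IMPORT_PROCESS_MODEL, the process parameter name "rows", and the response bodies are all hypothetical placeholders; the a!fromJson(http!request.body) and a!startProcess pattern itself is the standard way a Web API hands incoming JSON to a process:

    ```
    a!localVariables(
      /* Parse the JSON body pushed by the source system */
      local!rows: a!fromJson(http!request.body),
      /* Hand the rows to a process model that filters, validates,
         and writes to the landing table. cons!IMPORT_PROCESS_MODEL
         and the "rows" parameter are hypothetical. */
      a!startProcess(
        processModel: cons!IMPORT_PROCESS_MODEL,
        processParameters: { rows: local!rows },
        onSuccess: a!httpResponse(
          statusCode: 202,
          body: a!toJson({ status: "accepted" })
        ),
        onError: a!httpResponse(
          statusCode: 500,
          body: a!toJson({ status: "failed to start import" })
        )
      )
    )
    ```

    With this in place, the source system can post batches on whatever schedule suits it, which sidesteps the single 35-40 MB file entirely.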

    If that is not possible, then given the time required to process, filter, and validate 35-40 MB of Excel data, I suggest the asynchronous 'Refresh Until Asynchronous Action Completes' pattern: process, filter, and validate the Excel file in the background, and display the results to the user once the asynchronous series of activities completes.
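
    A minimal sketch of the polling side of that pattern, assuming a hypothetical rule!BV_getImportStatus that reads the background process's status from the database and a hypothetical rule!BV_showValidationResults that renders the validation outcome:

    ```
    a!localVariables(
      /* Re-query the import status every 30 seconds (0.5 minutes)
         while the user stays on the interface.
         rule!BV_getImportStatus is hypothetical. */
      local!status: a!refreshVariable(
        value: rule!BV_getImportStatus(importId: ri!importId),
        refreshInterval: 0.5
      ),
      a!match(
        value: local!status,
        equals: "COMPLETE",
        /* Show validation results once the background work finishes;
           rule!BV_showValidationResults is hypothetical. */
        then: rule!BV_showValidationResults(importId: ri!importId),
        default: a!richTextDisplayField(
          value: a!richTextItem(
            text: "Filtering and validating the uploaded file…"
          )
        )
      )
    )
    ```

    This keeps the UI responsive while the heavy filtering and validation run in a process model, and satisfies the requirement to inform the user in the UI once the resultant data has been validated.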