Hi Experts,
Earlier we were using the Export SQL to Excel smart service to generate an Excel file from a database table with 387 columns.
This service has now been deprecated. How can we get it un-deprecated, or is there any other workaround to generate the same output?
Tried using Export Data Store Entity to Excel, but it does not work beyond 50 columns. When we break the columns into sets of 50 and then try to write the next 50 columns, the previously written columns' data gets deleted.
Do you mean 387 rows? Or do you honestly mean that each row contains 387 named data elements?
It's 387 columns, David; there are more than 50K rows.
Export Data Store Entity to Excel has a limitation of 50 columns, hence we can't use that smart service.
Also, to confirm: is this a one-time data load, or are you building a process to import this massive Excel file regularly?
Yeah having some more information would be helpful for us here. Why do you need all those columns? What are you doing with the data? There are other ways of getting data out of Appian that may be better depending on your use case.
Hi Chris,
This will be a weekly import process. I have considered the following approaches for importing the data:
1: Read the data with the readexcelsheetpaging() function, transform it into a custom data type, and write it to the database using Write to Data Store Entity. With this approach we are able to include our RequestId in the records. The problem is that when the data set is too large, it brings down the server.
2: For larger data sets, use the same smart service in a script task with a batch size of 1,000 rows at a time. After writing each batch to the database, I check whether the last batch brings the total up to the expected row count; if not, I loop back to the script task to read the next 1,000 rows, transform them into the CDT, and write again.
3: Export the data to a staging table using Export Excel to DB, then move the data from the staging table to the main table along with the RequestId.
Please suggest if there is a better approach.
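For what it's worth, the batch-and-loop logic of approach 2 can be sketched outside Appian. The Python below is only an illustration of the control flow (read a page of rows, tag each with the RequestId as in approach 3, write, repeat until the source is exhausted); `read_rows` is a hypothetical stand-in for `readexcelsheetpaging()`, and sqlite3 stands in for your real database:

```python
import sqlite3

BATCH_SIZE = 1000  # rows per write, mirroring the script-task loop


def read_rows(source, start_index, batch_size):
    """Hypothetical stand-in for readexcelsheetpaging():
    return up to batch_size rows starting at start_index."""
    return source[start_index:start_index + batch_size]


def batched_import(source, conn, request_id):
    """Write rows in batches, tagging each row with the request id."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging (request_id TEXT, col_a TEXT, col_b TEXT)"
    )
    start = 0
    total = 0
    while True:
        batch = read_rows(source, start, BATCH_SIZE)
        if not batch:
            break  # last batch done: row counts match, stop looping
        conn.executemany(
            "INSERT INTO staging VALUES (?, ?, ?)",
            [(request_id, a, b) for a, b in batch],
        )
        conn.commit()
        total += len(batch)
        start += BATCH_SIZE
    return total


rows = [(f"a{i}", f"b{i}") for i in range(2500)]
conn = sqlite3.connect(":memory:")
print(batched_import(rows, conn, "REQ-1"))  # → 2500 (3 batches: 1000 + 1000 + 500)
```

The key point is that each iteration only holds one batch in memory, which is what keeps the large data set from overwhelming the server the way a single full read does in approach 1.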