Hi All,
We have a process report with more than 1 lakh (100,000) records. We are trying to build a process model that fetches the data from the report and writes it to a CSV file. For that, we have used the "Export Process Report to CSV" smart service. However, this smart service fetches only 10,000 records at a time, so we are not able to export the whole report in one go. Can someone please guide us on the best way to achieve this? Thanks in advance.
Is it possible for you to logically partition the data? You can apply a filter as part of the configuration of the smart service, which might allow you to extract the full data set by fetching it in chunks of fewer than 10k records each and then merging the output.
I can use a date-based filter, but then the question is: how do we decide how many times the loop should execute, and how can we apply a loop around that smart service?
Unless you can examine the generated CSV file and count the rows, you'll perhaps have to run one instance for a known set of "slices" of the data and accept that some may just be empty (e.g. if your data can be partitioned by day and you're only interested in the last 7 days, then run one instance per day and then merge the 7 data sets generated).
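To illustrate the merging half of this approach: the export itself happens inside Appian (one smart-service instance per day-slice), but the resulting per-slice CSV files could be concatenated afterwards by any small script. Here's a sketch in Python; the file names and paths are hypothetical, and it assumes each chunk repeats the same header row, which is skipped after the first file.

```python
# Sketch only: the per-day exports come from the Appian
# "Export Process Report to CSV" smart service; this merges
# the resulting chunk files. File names below are hypothetical.
import csv
from pathlib import Path

def merge_csv_chunks(chunk_paths, merged_path):
    """Concatenate CSV chunks, keeping the header from the first file only."""
    header_written = False
    with open(merged_path, "w", newline="") as out:
        writer = csv.writer(out)
        for path in chunk_paths:
            with open(path, newline="") as f:
                reader = csv.reader(f)
                header = next(reader, None)  # each chunk repeats the header
                if header is not None and not header_written:
                    writer.writerow(header)
                    header_written = True
                writer.writerows(reader)     # remaining data rows

# e.g. one file per day for the last 7 days (hypothetical names):
# merge_csv_chunks([Path(f"report_day{i}.csv") for i in range(7)],
#                  Path("report_full.csv"))
```

Empty slices are harmless here: a chunk containing only a header (or nothing) simply contributes no data rows to the merged file.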
When you say partition the data, do you mean creating different reports and dividing the data among them based on some filter? If so, that would increase the manual effort for us, especially since we have more than 1 lakh records. However, can we add a row number inside process reports? If yes, that would resolve my problem, as I could loop over the report based on indices and fetch the data in batches.