We are looking to use the queryProcessAnalytics expression for pulling data back

We are looking to use the a!queryProcessAnalytics() expression to pull back data on a large number of processes, using its query parameter. We plan to pass an array of potentially hundreds or thousands of integers into the a!query expression's filter parameter, using "in" as the operator. Can anyone clarify whether there is any limit on the number of values the a!query expression's filter parameter accepts?
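For reference, the call we have in mind looks roughly like this (the report constant, field name, and id values below are placeholders, not our actual names):

```
a!queryProcessAnalytics(
  report: cons!MY_PORTAL_REPORT,  /* placeholder: the target portal report */
  query: a!query(
    filter: a!queryFilter(
      field: "pp.id",  /* placeholder: the process id column in the report */
      operator: "in",
      value: {1001, 1002, 1003}  /* in practice, hundreds or thousands of ids */
    ),
    pagingInfo: a!pagingInfo(startIndex: 1, batchSize: -1)
  )
)
```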

OriginalPostID-200095

  • @matthewb That's an interesting question. I came across this exact scenario while trying to export tens of thousands of rows from a portal report using a!queryProcessAnalytics().

    Here is one of the approaches I tried; it's admittedly odd, but it answers your question to some extent:

    1. Write an expression rule that fetches 10000 records (10000 being the maximum allowed, per the 'resources.appian.analytics.application.maxreportrows=' setting and the number of engines in our environment). The rule accepts filters (generally the identifiers of records already retrieved) and appends its output to an accumulator at the end of each call, which makes it suitable for populating the accumulator when invoked with fn!reduce() (a sketch of such a rule follows this list).

    2. Invoke the rule from step 1 with fn!reduce(). Because the output is appended to the accumulator at the end of each call, the identifiers already in the accumulator can be applied as a "not in" filter on the next call to a!queryProcessAnalytics().
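    A minimal sketch of such a rule, assuming the report's process id column is exposed as "pp.id" and that cons!MY_PORTAL_REPORT points at the portal report (the rule name, constant, and field names are all placeholders for whatever your environment uses):

    ```
    /* Hypothetical expression rule "rule!APP_getReportBatch" with inputs     */
    /* ri!accumulator (the rows fetched so far) and ri!ignored (the dummy     */
    /* iteration token supplied by reduce()).                                 */
    append(
      ri!accumulator,
      a!queryProcessAnalytics(
        report: cons!MY_PORTAL_REPORT,  /* placeholder: the portal report */
        query: a!query(
          filter: if(
            length(ri!accumulator) = 0,
            null,  /* first call: the accumulator is empty, so no filter */
            a!queryFilter(
              field: "pp.id",  /* placeholder: the process id column */
              operator: "not in",
              value: index(ri!accumulator, "id", null)  /* ids fetched so far */
            )
          ),
          pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 10000)
        )
      ).data
    )
    ```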

    In short, the above works as follows when querying 40000 records:

    >> Call 1: get records 1 to 10000 (no filter, since the accumulator is empty on the first call). Append the results to the accumulator, which now holds 10000 records.
    >> Call 2: get the next 10000 records whose identifiers are "not in" the accumulator (which holds 10000 records). Append the results; the accumulator now holds 20000 records.
    >> Call 3: get the next 10000 records whose identifiers are "not in" the accumulator (which holds 20000 records). Append the results; the accumulator now holds 30000 records.
    >> Call 4: get the next 10000 records whose identifiers are "not in" the accumulator (which holds 30000 records). Append the results; the accumulator now holds 40000 records.
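    Put differently, the walkthrough above boils down to a single reduce() call over four dummy iterations (again, the rule name is hypothetical, and this assumes your version lets you pass a rule reference to looping functions):

    ```
    reduce(
      rule!APP_getReportBatch,  /* the batch rule sketched above */
      {},                       /* start with an empty accumulator */
      repeat(4, "batch")        /* four dummy items => four calls of 10000 rows each */
    )
    ```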

    This approach may seem odd, and you might wonder why we didn't simply batch with a!pagingInfo(startIndex: 1, batchSize: 10000), a!pagingInfo(startIndex: 10001, batchSize: 10000), and so on. The reason is that query analytics throws an exception as soon as it sees a paging value greater than 10000, which is why we resorted to the approach above.
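    For example, this is the straightforward batching we could not use; on our environment, the second window already reaches past row 10000:

    ```
    /* Works: the window stays within the first 10000 rows */
    a!pagingInfo(startIndex: 1, batchSize: 10000)

    /* Threw an exception on our environment when passed to           */
    /* a!queryProcessAnalytics(): startIndex exceeds the analytics cap */
    a!pagingInfo(startIndex: 10001, batchSize: 10000)
    ```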

    In this way we ended up applying 100000+ values as "in" and "not in" filter values, and the results were completely erroneous; I was unable to draw any pattern out of them. From my observations, problems typically started once more than 30000 filter values were applied with "in" or "not in", and at around 80000 to 100000 values the behavior was completely unpredictable. At times we saw issues even with 15000 or 20000 values. Based on that experience, I would suggest refraining from applying that many filter values. Had it worked, I would have built a generic, reusable rule to query any number of records past the 10000 limit, but because of this erratic behavior we abandoned the approach of applying huge lists of filter values with "in" or "not in".

    FYI, I observed this behavior on Appian 7.8, and the function I used was a!queryProcessAnalytics().

    If you still want to go ahead with applying thousands of filter values, I suggest testing the approach thoroughly and closely inspecting the resulting datasets. You could also plan for an alternative, but that depends on the use case, and the downside is that the alternative might not be reusable or generic.

    Hope the above information is of some help to you.
  • Thanks for your response! It sounds like even if I could pass an unlimited number of items in a filter, my resulting data set would be limited to 10000 anyway. I'll implement what you described, or at the very least limit our users to 10000 records.