
Hi

I'm using the export data to Excel plug-in to allow users to export data held in an Appian database to Excel. I've now hit the limit on the number of records that can be returned by database queries.

I'm wondering what options I have for exporting 4000+ records to Excel (assuming the export completes within the time limit)?

Any opinions / advice will be gratefully received!

Thanks

Paul

OriginalPostID-154621

  • @paulc You could opt for one of the approaches specified below:

    1. Using query rules: implement a batching technique, i.e. retrieve the records in batches, where each batch returns a fixed number of records. Taking your case of 4000 records: get the first 1000 records (1 - 1000, i.e. rule!myQueryRule(topaginginfo(1, 1000))) in the first batch, then the next 1000 (1001 - 2000, i.e. rule!myQueryRule(topaginginfo(1001, 1000))) in the second batch, and so on (see the sketch after this list). As far as I know, this is the only viable way to obtain the entire dataset given the limits Appian imposes on querying an RDBMS.

    2. Using the Query Database smart service: similarly, I believe the only way to retrieve the entire dataset is to run the smart service multiple times, once per batch.
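
    A minimal sketch of the batching approach from item 1, assuming a recent Appian version where a!localVariables() and a!forEach() are available (older versions would use load()/apply() instead); rule!myQueryRule and the counts are illustrative:

        /* Query 4000+ records in batches of 1000 and combine the results */
        a!localVariables(
          local!batchSize: 1000,
          local!totalCount: 4000, /* in practice, obtain this from a count query */
          local!batchCount: ceiling(local!totalCount / local!batchSize),
          a!flatten(
            a!forEach(
              /* enumerate() yields 0..batchCount-1, so start indexes are 1, 1001, 2001, 3001 */
              items: enumerate(local!batchCount),
              expression: rule!myQueryRule(
                topaginginfo(fv!item * local!batchSize + 1, local!batchSize)
              )
            )
          )
        )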

    Although there are ways to increase the limit on the amount of data that can be returned by query rules or the Query Database smart service, raising those limits is generally not a best practice.

    Additionally, you may see a performance gain by using a!queryEntity(), as described at https://forum.appian.com/suite/help/7.9/Querying_Data_From_an_RDBMS.html (see the sketch below).
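
    For reference, a minimal a!queryEntity() sketch, assuming a data store entity constant named cons!MY_ENTITY (hypothetical) and omitting selection and filters for brevity:

        /* Returns a DataSubset; .data holds the retrieved rows */
        a!queryEntity(
          entity: cons!MY_ENTITY,
          query: a!query(
            pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 1000)
          )
        ).data

    The same batching idea applies here: step the a!pagingInfo() startIndex in increments of the batch size until all rows are retrieved.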

    With respect to the export time limit, I don't think we can guarantee that the queries will complete within the intended time limit, because the time they take is roughly proportional to the amount of data being retrieved.

    Hope that helps you to some extent.