Best Practice to call Rest API in Appian

Certified Lead Developer

Hi All,

I have a requirement where I need to call a REST API, parse the response, and persist the data for every row in one of our tables. The table currently has around 4,000 rows, which means 4,000 API calls.

I am seeking answers to the below questions:

1. Is this feasible in Appian? (Currently, when I try to hit the API, it hangs.)

2. What is the potential effect on our environment?

3. What is the best possible way to achieve this? (Most important)

 

Thanks


  •
    Certified Lead Developer
    in reply to aloks0189
    Let me clarify with an example:

    Example - I have 4,000 customer IDs in the DB, and for each ID I need to call a third-party REST API that returns the relevant customer data in XML format. I need to parse that XML and then store the customer details in the Appian DB.
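    To make the parsing step concrete, here is a minimal Python sketch of flattening one customer response for a DB insert. The XML structure below is invented for illustration; the real third-party response will differ.

```python
import xml.etree.ElementTree as ET

# Hypothetical response shape; the real third-party XML will differ.
SAMPLE = """<customer>
  <id>42</id>
  <name>Acme Corp</name>
  <status>active</status>
</customer>"""

def parse_customer(xml_text):
    """Flatten one customer XML document into a dict ready for a DB insert."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

record = parse_customer(SAMPLE)  # {'id': '42', 'name': 'Acme Corp', 'status': 'active'}
```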
  •
    Certified Lead Developer
    in reply to PGarg
    As per my understanding, there are only two possibilities for invoking the API:
    1) One API call per iteration (which is not a good idea in terms of performance).
    2) To avoid the above approach, you may need some customization on the API side. If customization is possible (i.e., you are also the one who designed the API), try converting multiple rows into JSON and sending them in a single call. On the API side, parse the input into an array, collect all the required information from the API's DB, convert it back into JSON/XML, and send that as a single response to Appian/the client.

    However, this approach is only feasible if the API can be customized; otherwise, there is no alternative to one API call per iteration/row.
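    If the API can be made batch-aware, the client side of the idea can be sketched like this (a minimal Python sketch; the payload key `customerIds` and the batch size are assumptions, not the real API contract):

```python
import json

def build_batch_payloads(customer_ids, batch_size=100):
    """Chunk customer IDs and serialize each chunk as one JSON payload,
    so a batch-enabled API receives 40 calls instead of 4,000."""
    payloads = []
    for start in range(0, len(customer_ids), batch_size):
        chunk = customer_ids[start:start + batch_size]
        payloads.append(json.dumps({"customerIds": chunk}))
    return payloads

# 4,000 IDs at a batch size of 100 -> 40 payloads / API calls
payloads = build_batch_payloads(list(range(1, 4001)))
```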

    However, if you can't customize the API but still want to protect performance, I would recommend fetching the data in batches (say, 100 rows). Once the 100 rows are processed, pause or terminate the process, then restart it a couple of hours later with the help of a Timer event. This ensures you do not process all the rows at once.
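    The pause-and-resume pattern can be sketched as follows (Python sketch; in Appian the cursor would be a persisted value and each loop iteration would be one timer-triggered run):

```python
def process_batch(rows, handler, start_index, batch_size=100):
    """Process one batch and return the index where the next run
    should resume, mimicking a timer-restarted process."""
    end = min(start_index + batch_size, len(rows))
    for row in rows[start_index:end]:
        handler(row)
    return end  # persist this value before pausing the process

processed = []
rows = list(range(250))
cursor = 0
while cursor < len(rows):  # each loop body = one timed run
    cursor = process_batch(rows, processed.append, cursor)
```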

    I hope one of these options suits your requirement.
  •
    Certified Lead Developer
    in reply to aloks0189
    Thanks, Alok.

    Customization of the API is not possible, so that option is ruled out.
    Yes, the batch-size approach is what I am currently following, but I wanted to know if there are better ways.

    I am facing one more challenge here: I need to store the response from each API call in another table via a stored procedure call, or write the response to a flat file using the document generation smart service. For both approaches I have to store the response in a process variable (pv), which I want to avoid, as the response is extremely large for each call (multiplied by 4,000).

    Is there any alternative for this?

    Thanks
  • Try this approach if you are planning to store the data in DB tables.

    Directly execute the integration rule that fetches the data from the API inside the Execute Stored Procedure input, rather than passing a pv as the input. This way you avoid storing the huge response in a pv, which is more memory efficient.
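    As a loose analogy outside Appian, the idea is to pass the API call's result straight into the persistence call so no intermediate variable ever holds the payload. All function names below are hypothetical stand-ins, not real Appian APIs:

```python
def call_api(customer_id):
    # Hypothetical stand-in for the integration rule that hits the REST API
    return f"<customer><id>{customer_id}</id></customer>"

def run_stored_proc(proc_name, *params):
    # Hypothetical stand-in for the Execute Stored Procedure smart service
    return (proc_name, params)

def persist_customer(customer_id):
    # The response flows directly into the stored-procedure call;
    # no intermediate variable (the Appian pv) ever holds the payload.
    return run_stored_proc("sp_upsert_customer", customer_id, call_api(customer_id))
```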

    Further, you can define the pv's property as hidden so that it is not captured in process history, which gives better performance. The link below gives further information on this.

    community.appian.com/.../creating-memory-efficient-models

    Regards,
    Vadivelan