Best Practice to Call a REST API in Appian

Hi All,

I have a requirement where I need to call a REST API, parse the response, and persist the data for every row in one of our tables. Currently the table has around 4000 rows, which means 4000 API calls.

I am seeking answers to the questions below:

1. Is this feasible in Appian? (Currently, when I try to hit the API, it goes into a hung state.)

2. What could be the potential effect on our environment?

3. What is the best possible way to achieve this? (Most important)

 

Thanks


  • Certified Lead Developer
    One question before I proceed to provide any suggestion:

    Does the service-side DB (where the REST API is exposed) contain the 4000 rows, or the client side (Appian/your system)? From the description above, my understanding is that there are 4000 rows which will be returned by the REST API, and you would like to parse them and store them in the Appian-mapped DB.


    Please let me know whether I have understood your requirement correctly.

  • Client side (Appian DB) has 4000 rows, and we need to call the REST API 4000 times.
  • Certified Lead Developer
    in reply to PGarg
    So, does that mean you want to send the Appian data to the REST API?
  • Let me clarify with an example:

    Example - I have 4000 customer IDs in the DB, and for each ID I need to call a third-party REST API to get the customer's data in XML format, which I then need to parse so I can store the customer details (from the XML) in the Appian DB.
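The per-ID flow in the example above can be sketched outside Appian. This is a minimal Python illustration only; the XML shape and the fetch/save callables are assumptions, not the real third-party API:

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch of the loop described above: one REST call per ID,
# parse the XML response, persist the extracted fields.

def parse_customer(xml_text):
    """Extract the fields to persist from one (assumed) API response."""
    root = ET.fromstring(xml_text)
    return {
        "id": root.findtext("id"),
        "name": root.findtext("name"),
        "email": root.findtext("email"),
    }

def process_ids(customer_ids, fetch, save):
    """For each ID: call the API (fetch), parse the XML, persist (save)."""
    for cid in customer_ids:
        xml_text = fetch(cid)           # one REST call per customer ID
        record = parse_customer(xml_text)
        save(record)                    # write to the Appian-mapped DB
```

In Appian terms, fetch would correspond to the integration call and save to the DB write; the sketch only shows the loop-parse-persist shape, not Appian specifics.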
  • Certified Lead Developer
    in reply to PGarg
    As I understand it, there are only two ways to invoke the API:
    1) One API call per iteration (which is not a good idea in terms of performance).
    2) If you want to avoid the above approach, you may need some customization on the API side. If customization is possible (i.e., if you are also the one who designed the API), try converting multiple rows into JSON and sending them to the API in a single call. On the API side, parse the input into an array, collect all the corresponding information from the API's DB, convert it back into JSON/XML, and send that as one response to Appian/the client.

    But this approach is only feasible if there is a chance to customize the API; otherwise there is no option other than one API call per iteration/row.

    However, if you cannot customize the API but still want to protect performance, I would recommend fetching the data in batches (say, 100 rows). Once the 100 rows are processed, pause or terminate the process and restart it a couple of hours later with the help of a timer. This ensures you do not process all the rows at once.

    I hope one of these options suits your requirement.
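The batch-and-pause idea above can be sketched as follows. This is a minimal illustration under assumptions (the batch size, pause length, and handle_row callable are placeholders), with time.sleep standing in for the timer-driven restart of the Appian process:

```python
import time

# Hypothetical sketch of batch processing: handle rows in fixed-size
# batches and pause between batches instead of processing all rows at once.

def process_in_batches(rows, handle_row, batch_size=100, pause_seconds=0):
    """Process rows batch by batch, optionally pausing between batches."""
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        for row in batch:
            handle_row(row)             # one API call + persist per row
        if pause_seconds and start + batch_size < len(rows):
            time.sleep(pause_seconds)   # stand-in for the timer-driven restart
```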
  • Thanks Alok,

    Customization of the API is not possible, so that option is ruled out.
    Yes, the batch-size approach is what I am currently following, but I just wanted to know if there are better ways.

    Here I am facing one more challenge: I need to store the response from each API call in another table using a stored procedure call, OR write the response to a flat file using the document generation smart service. For both approaches I would have to store the response in a PV, which I need to avoid because the response for each call is extremely large (multiply that by 4000).

    Is there any alternative for this?

    Thanks
  • Try this approach if you are planning to store the data in DB tables.

    Directly execute the integration rule that fetches the data from the API inside the Execute Stored Procedure call's input, rather than passing a PV as the input. This way you avoid storing the huge data in a PV, which in turn is memory-efficient.

    Further, you can also set the PV's property to hidden so that it is not captured in process history, which gives better performance. The link below gives further information on the same.

    community.appian.com/.../creating-memory-efficient-models

    Regards,
    Vadivelan
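The idea of feeding the integration result directly into the stored-procedure input, rather than parking it in a variable first, can be sketched like this. The procedure name and both callables are hypothetical, not real Appian or DB APIs:

```python
# Minimal sketch: pass the API response straight through as the DB-write
# input, with no long-lived intermediate variable (the analogue of avoiding
# a large Appian PV). "usp_save_customer_response" is a made-up name.

def fetch_and_store(customer_id, fetch, execute_sp):
    """Call the API and hand its response directly to the stored procedure."""
    # fetch(customer_id) is passed inline as the stored-procedure input;
    # the large response is never assigned to a separate variable here.
    execute_sp("usp_save_customer_response", fetch(customer_id))
```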