Hi,
I am working on a use case where a service-backed record type fetches data that is both displayed and fed into the corresponding reports. The provider system holds around 2 million records, and the record sync is timing out (100,000 records synced in 40 minutes). Additionally, each call returns 1,000 records.
The requirement is to sync all 2 million records into the Appian record type.
If you have experienced something similar or have a solution or workaround for this scenario, please advise.
Thanks in advance.
Regards,
Amit
Did you check the documentation about sync issues?
docs.appian.com/.../Records_Monitoring_Details.html
Hi Stefan,
The challenge here is that we must get the dataset into Appian. The approaches we have tried:
Going with the sync option: it takes around 40 minutes to fetch 100,000 records at 1,000 rows per batch.
Going with real-time API calls instead of sync: the API is called on every request.
Given the limit of 1,000 records per call, is there any feasible way to sync the data?
Thanks,
Let me make sure I understand your problem. Fetching 100k records takes 40 minutes, so fetching 2,000k would take around 13 hours, which exceeds the sync limit. Correct?
To get an idea of what the best option would be, we need more details about the data source. Would a real-time call be an option? Why is the sync so slow, i.e., why does each sync call take 24 seconds?
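The estimate above follows directly from the figures quoted in this thread; a quick back-of-the-envelope check (using only the numbers already posted, nothing Appian-specific):

```python
# Figures from the thread: 1,000 records per call, 100,000 records in 40 minutes.
calls_needed = 2_000_000 // 1_000            # calls to cover the full dataset
sec_per_call = (40 * 60) / (100_000 // 1_000)  # 2400 s over 100 calls = 24 s/call
total_hours = calls_needed * sec_per_call / 3600

print(calls_needed, sec_per_call, round(total_hours, 1))  # 2000 24.0 13.3
```

So roughly 2,000 calls at ~24 seconds each, or about 13 hours for a single full sync.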
Yes, that's correct. The data source is GraphQL, and each API call takes somewhere between 15 and 25 seconds; as a reference point, syncing 100K records took 40 minutes.
We are open to suggestions on the real-time API call approach too! Also, can the limit of 1,000 records per batch be increased by any means?
So it seems the automatic full sync will not work for you. There are smart services that allow you to tell Appian which records have changed. Maybe you can build a process loop around them; that should not be limited by the full sync time limit.
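The loop suggested above could look roughly like this. This is only a sketch of the batching pattern, not Appian or GraphQL code: `fetch_page` and `write_batch` are hypothetical stand-ins for one paginated source call and one record-update step (in Appian this would be a process model looping over an integration and a sync smart service):

```python
BATCH_SIZE = 1000  # the per-call limit mentioned in the thread

def fetch_page(source, offset, limit):
    """Stand-in for a single paginated source call returning up to `limit` records."""
    return source[offset:offset + limit]

def sync_all(source, write_batch):
    """Page through the source one batch at a time instead of relying on
    one monolithic full sync; returns the number of records synced."""
    offset = 0
    while True:
        page = fetch_page(source, offset, BATCH_SIZE)
        if not page:          # source exhausted
            return offset
        write_batch(page)     # stand-in for the per-batch record update/sync step
        offset += len(page)

# Example: 2,500 fake records sync in 3 batches (1000 + 1000 + 500).
batches = []
total = sync_all(list(range(2500)), batches.append)
print(total, len(batches))  # 2500 3
```

Because each iteration is an independent call, the overall run is bounded by the process, not by the single-sync time limit; the same loop also works incrementally if the source can filter to records changed since the last run.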