Hi,
We have a requirement to write over 50,000 rows daily to our MariaDB database.
I want to know the best way to get this done. We tried using a Record Type, but the data was too large for it to handle and a circuit breaker cancelled the synchronisation.
I have seen that creating multiple node instances is not recommended, as we would create an excessive connection pool against the DB.
The only solution that comes to mind is to have a subprocess that gets fed smaller batches (100 or 1,000 rows), looping through the data and inserting it sequentially.
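To make that concrete, here is a minimal sketch (Java, plain JDBC against MariaDB) of what each batch handed to the subprocess would amount to at the database level. The table name, column list, and batch size of 1,000 are illustrative assumptions, not our actual schema:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;

public class BatchInserter {

    private static final int BATCH_SIZE = 1000; // illustrative batch size

    // Inserts all rows over a single connection, flushing and committing every BATCH_SIZE rows
    // so that no single statement or transaction has to hold the full 50,000-row payload.
    static void insertInBatches(List<Object[]> rows, String jdbcUrl, String user, String password)
            throws Exception {
        String sql = "INSERT INTO cached_rows (id, label, value) VALUES (?, ?, ?)"; // hypothetical table
        try (Connection conn = DriverManager.getConnection(jdbcUrl, user, password);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            conn.setAutoCommit(false);
            int count = 0;
            for (Object[] row : rows) {
                ps.setObject(1, row[0]);
                ps.setObject(2, row[1]);
                ps.setObject(3, row[2]);
                ps.addBatch();
                if (++count % BATCH_SIZE == 0) {
                    ps.executeBatch(); // send one batch to MariaDB
                    conn.commit();     // keep each transaction small
                }
            }
            ps.executeBatch();         // flush the final partial batch
            conn.commit();
        }
    }
}

The same single-connection, commit-per-batch pattern keeps the connection pool small no matter how many batches the subprocess loops through.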
My first question: Where is that data coming from, and in which format?
It's coming from a data management platform which is consumed by Appian through a connected system. The format Appian gets it in is a map which we cast to whatever we need. We are obliged to use this data management platform as a hub for all data queries.
The use case is that we need the data as read-only, but the response time for the request is very long, so we are trying to find a way to "cache" it daily for Appian to use in dropdowns and dashboards.
I'm considering calling it as a local variable in interfaces, but that creates a large overhead for each interface that makes the call.
Another option is to call it in a process and create a process report which gets archived daily.
That sounds like a perfect use case for synced records?!?
I came to the same conclusion, hence the following error:
2022-11-03 15:21:59,404 [ajp-nio-0.0.0.0-8009-exec-444] ERROR com.appiancorp.rest.shared.AppianExceptionMapper - Internal Server Error on REST API invocation. com.appiancorp.exceptions.LocalizedAppianRuntimeException$LocalizedAppianException: Expression evaluation error [evaluation ID = 14504:8b784] : The Memory Circuit Breaker prevented this evaluation from completing due to insufficient resources. Please contact your system administrator. (APNX-1-4510-000) (APNX-1-4510-000)
That sounds like you need to set up the sync in batches. That is directly supported by Appian. Does the API support it as well?
You are correct. I simply needed to create batches, and that solved my issue. The API is capable of batching.
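For anyone landing here later, the shape of the batched retrieval is roughly the following (Java sketch; the endpoint and the offset/limit parameter names are placeholders for whatever paging the data management platform actually exposes):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class BatchedFetch {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        int batchSize = 1000; // assumed page size accepted by the API
        int offset = 0;
        while (true) {
            // Hypothetical endpoint and paging parameters; substitute the real ones.
            HttpRequest request = HttpRequest.newBuilder(
                    URI.create("https://data-hub.example.com/rows?offset=" + offset + "&limit=" + batchSize))
                    .GET()
                    .build();
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            String body = response.body();
            if (body == null || body.isBlank() || "[]".equals(body)) {
                break; // no more rows: the sync is complete
            }
            // Hand this batch to the insert/sync step, then request the next page.
            offset += batchSize;
        }
    }
}

Each batch stays well under the memory circuit breaker's limit, and the sync simply repeats the call until the source is exhausted.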