Hi All,
I am developing an application in which I have to fetch all the records from a table. The problem is that the expression rule fails while running, and I think it is failing because I am fetching everything at once.
I want to fetch the data in batches. How can I make it fetch in batches but still write everything at once? Also note that this table will grow by 5-7k rows every day.
And if I fetch data in batches, won't that hit the DB multiple times? What should the preferred batch size be?
Please suggest a good practice to achieve the above.
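Not an answer from the thread, but here is the general pattern the question describes, sketched in plain Python rather than Appian SAIL (in Appian this would typically be a!queryEntity with a!pagingInfo startIndex/batchSize inside a looping node). fetch_batch and write_all are hypothetical stand-ins for your own query and write steps. Yes, each batch is a separate DB round trip, but each one is small and cheap; the point is to keep any single query under the size limit:

```python
# Sketch: fetch in fixed-size batches, accumulate in memory,
# then write everything in one final call.
# fetch_batch / write_all are hypothetical stand-ins, not real Appian APIs.

SOURCE_TABLE = [{"id": i, "metric": i * 10} for i in range(1, 23)]  # fake table

def fetch_batch(start_index, batch_size):
    """Stand-in for one paged query (1-based start index)."""
    return SOURCE_TABLE[start_index - 1 : start_index - 1 + batch_size]

def fetch_all_in_batches(batch_size=10):
    rows, start = [], 1
    while True:
        batch = fetch_batch(start, batch_size)
        if not batch:           # no more rows: stop paging
            break
        rows.extend(batch)
        start += batch_size     # advance to the next page
    return rows

def write_all(rows):
    """Stand-in for the single bulk write at the end."""
    return len(rows)

all_rows = fetch_all_in_batches(batch_size=10)
written = write_all(all_rows)   # many small reads, one write
```

Batch size is a trade-off between round trips and per-query size; there is no single right number, but something in the low thousands per page is a common starting point, tuned against your row width and timeout limits.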
Hi!
Please could you add more information regarding the use case?
Could you add a screenshot of the error you get while fetching?
Could you also explain why you need to fetch everything at the same time?
Are you using records?
Thanks
Hi,
The use case is that I want to store the Appian process metrics and task metrics data from the Appian report in an Oracle DB.
To achieve that, I run a process every night. It first fetches all the process metrics data from the table, then from the Appian process metrics report; after that it finds the unique entries and writes them to the database. That is the use case. Please find below the screenshot of the error:
Why would you like to copy data from reports every day to the DB? What is the purpose?
We want to keep the data for further analysis by other teams. In Appian the data would not always be persisted, right? We need that data, so we are keeping it in a database.
Based on the data you need, I would recommend creating a specific table and saving/updating the data directly from the process models, rather than duplicating a report. Are you auditing changes? That might already be the data you need.
That's what I am doing, buddy. But to avoid writing duplicate data I have to compare the Appian report with the existing data in the DB, right? So I first fetch everything from the DB and then compare, and that fetch is now failing because of the large size.
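One way to shrink that fetch, sketched here in plain Python with made-up field names: instead of pulling full rows from the DB, fetch only the unique key column(s) and build a set, then keep only the report rows whose key is not already present. The key name `process_id` and the sample data are assumptions for illustration:

```python
# Duplicate check using keys only, assuming each row has a unique key
# (process_id here, hypothetical). Fetching just the key column is far
# smaller than fetching every row.

existing_db_keys = {101, 102, 103}   # stand-in for: SELECT process_id FROM metrics

report_rows = [
    {"process_id": 102, "duration": 5.0},   # already in DB -> skip
    {"process_id": 104, "duration": 7.5},   # new -> write
    {"process_id": 105, "duration": 2.1},   # new -> write
]

# Keep only rows whose key is not already stored.
new_rows = [r for r in report_rows if r["process_id"] not in existing_db_keys]
```

If you control the Oracle side, an alternative is to push the comparison into the database itself, e.g. with a unique constraint or a MERGE (upsert) statement, so nothing has to be fetched back into Appian at all.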