Hi All,
I am developing an application in which I have to fetch all the records from a table. The problem is that the expression rule fails when it runs, which I think is because I am fetching everything at once.
I want to fetch the data in batches, but when writing, it should write everything at once. How can I set that up? Also note that this table will grow by 5-7k rows every day.
And if I fetch data in batches, won't it hit the DB multiple times? What should the preferred batch size be?
Please suggest a good practice to achieve the above.
You cannot, and should not, try to load "all" rows. Please help us understand what you want to achieve.
There is a table which my Appian process updates from Appian reports. To update it, the process first fetches everything from the table, then fetches the data from the Appian report, and after comparing them and finding the unique values, it writes those rows into the table. This is what I want to achieve.
I understand, but that does not work. You will have to find a way to NOT load ALL the data into Appian memory and manipulate it there.
Without knowing what all this data manipulation is for, one idea might be to first store that data in a temporary table and then run a stored procedure to do the merging operation, as sketched below.
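As a rough illustration of that idea (not from the thread): assuming a hypothetical staging table report_staging, which Appian fills with the report rows, and a hypothetical target table report_data with a UNIQUE constraint on its business key record_key, the merge could be a single MariaDB/MySQL statement:

```sql
-- Sketch only: table and column names are made up for illustration.
-- INSERT IGNORE skips any staging row whose record_key already exists
-- in report_data, relying on the UNIQUE constraint on record_key.
INSERT IGNORE INTO report_data (record_key, col_a, col_b)
SELECT s.record_key, s.col_a, s.col_b
FROM report_staging s;
```

This way only the new rows ever leave the database, and nothing has to be compared in Appian memory.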
My ultimate goal is to write unique rows to the table, so that can be done using a procedure as well, right? Do you have an example of that, i.e. how to write data passed from Appian using a procedure?
I do not have an example at hand. Writing a stored procedure for MariaDB/MySQL is covered on Stack Overflow.
docs.appian.com/.../Execute_Stored_Procedure_Smart_Service.html
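For illustration only, a minimal sketch of what such a procedure might look like in MariaDB/MySQL. The table and column names (report_staging, report_data, record_key) are assumptions carried over from the earlier sketch; this variant uses NOT EXISTS instead of a UNIQUE constraint, and the Execute Stored Procedure Smart Service would simply call the procedure by name after Appian has written the batch into the staging table:

```sql
DELIMITER //

-- Hypothetical procedure: merge new rows from the staging table into the
-- target table, then empty the staging table for the next run.
CREATE PROCEDURE merge_report_rows()
BEGIN
    -- copy only rows whose record_key is not yet present in the target table
    INSERT INTO report_data (record_key, col_a, col_b)
    SELECT s.record_key, s.col_a, s.col_b
    FROM report_staging s
    WHERE NOT EXISTS (
        SELECT 1
        FROM report_data d
        WHERE d.record_key = s.record_key
    );

    -- clear the staging table so the next run starts empty
    TRUNCATE TABLE report_staging;
END //

DELIMITER ;
```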