
Hi, I have 500 rows of data in an external table which I am trying to delete in batches of 50 using the "Delete from Data Store Entity" smart service. I am constructing a pv of type EntityDataIdentifiers as {entity: cons!DATA_STORE_ENTITY, identifiers: pv!ResultIDs}. However, I am getting Health Check issues flagging the operation with "5500 ms as peak average".
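Roughly, this is what I am constructing per batch (pv!BatchIDs here is just a placeholder name for the 50 IDs taken in the current iteration; the exact slicing mechanism in my model may differ):

    /* Next batch: take the first 50 primary keys still waiting in pv!ResultIDs */
    todatasubset(
      pv!ResultIDs,
      a!pagingInfo(startIndex: 1, batchSize: 50)
    ).data

    /* Value passed to the Delete from Data Store Entity input for this iteration */
    {
      entity: cons!DATA_STORE_ENTITY,
      identifiers: pv!BatchIDs
    }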

Please suggest how I can keep this operation off the Health Check list.

OriginalPostID-225807



  • @janardhnak To my knowledge, data store operations are expected to complete within 2000 ms, otherwise they may be flagged in Health Check. Even if an operation finishes under 2000 ms on one environment at a certain point in time, that is not guaranteed at all times on the same or other environments, because many factors influence the total execution time, and it can vary especially during peak hours.

    I would suggest reducing the batch size and making sure each operation completes well under 2000 ms, so that there is a buffer to absorb peak usage hours. Say you fix the batch size so the delete always takes around 1600 ms (an arbitrary value); even if the same operation takes 1800 to 1900 ms during peak hours, it still will not affect your Health Check score. A rough sketch of a tunable batch size is shown below.
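    As a rough sketch (assuming a hypothetical pv!BatchSize process variable so the size can be tuned without editing the model), each loop iteration could take only the first pv!BatchSize identifiers and then drop them once the delete node completes:

      /* Next batch: the first pv!BatchSize identifiers still pending deletion */
      todatasubset(
        pv!ResultIDs,
        a!pagingInfo(startIndex: 1, batchSize: pv!BatchSize)
      ).data

      /* After the Delete from Data Store Entity node succeeds, remove them from the queue */
      ldrop(pv!ResultIDs, min(pv!BatchSize, length(pv!ResultIDs)))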
  • Hi Janardhan, also please see whether a logical deletion is feasible rather than a hard delete (for example, setting an ACTIVE_IN flag to 'N'). Hard deletes can be time consuming if you have triggers that write to audit tables, or constraints that depend on the deleted rows. A sketch of the soft-delete approach follows.
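    As a sketch of the logical-delete alternative (the field name activeIn and the variable pv!RowsToDeactivate are assumptions; use whatever your CDT maps to the ACTIVE_IN column and whatever pv holds the rows fetched for the batch), you could flag the rows and write them back instead of deleting:

      /* Mark each row inactive; "activeIn" is the assumed CDT field mapped to ACTIVE_IN */
      a!forEach(
        items: pv!RowsToDeactivate,
        expression: a!update(data: fv!item, index: "activeIn", value: "N")
      )

    The resulting array can then be saved through the Write to Data Store Entity smart service in the same batches, which avoids the audit-table triggers and constraint checks that make hard deletes slow.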