# Stored Procedure

Hi All,

Is there any restriction on the amount of data a stored procedure can retrieve from the DB?

I am facing a weird intermittent issue: when I execute the SP from the Appian end (process modelling), it fails to execute most of the time. Out of 10 runs it succeeds at most once; the rest of the time the node is simply skipped and no data is saved. (I am selecting from one table and, after manipulating some of the data, inserting into another.) However, when I execute the SP directly against the DB it works as expected, which suggests there is no logical issue. How can I solve this?

Thanks in advance!

Note: I would also like to highlight that the SP's execution time ranges from 40 minutes to an hour, as I am dealing with a huge amount of data.


  • 0
    Certified Lead Developer
    Hi, the execution time might be causing the issue here. Did you check the logs? Did you find anything related to a timeout?

    Alternatively, you can use MAX_EXECUTION_TIME = 3600000 (in milliseconds).

    But this works only for read-only SELECT statements, and it kills any query that exceeds the given limit.

    In your case, you need to raise the MySQL connection timeout to more than an hour. That should work for you.
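    For reference, the limit can be set like this (a sketch assuming MySQL 5.7+; note it applies only to SELECT statements, so it would not cover the INSERT side of your procedure):

    ```sql
    -- Set the limit for the whole session, in milliseconds
    SET SESSION MAX_EXECUTION_TIME = 3600000;

    -- Or apply it to a single statement via an optimizer hint
    SELECT /*+ MAX_EXECUTION_TIME(3600000) */ *
    FROM source_table;
    ```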

    Hope this will help you.
  • As Alok mentioned, the execution time is likely the culprit.

    Your best bet is to improve the performance of this procedure, or to split its functionality into multiple calls.
  • Hi melwyna,
    You can optimize the stored proc for better performance. You can also use cursors if you genuinely need row-by-row looping.
    There might be multiple reasons for the slowness:
    1. Indexes on the target table: if the target table has indexes, they can slow down bulk inserts and updates. Consider dropping the indexes before the operation and recreating them afterwards (useful if this is a one-time or weekly process that runs outside business hours).
    2. Huge data: if the data volume is huge, the stored proc simply takes time to execute. I dealt with 100k database rows and execution took around 10-15 minutes. With large volumes even simple operations can hurt performance, so the procedure needs to be designed carefully.
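    The index suggestion in point 1 looks roughly like this (a sketch; the table, index, and column names are placeholders):

    ```sql
    -- Drop the secondary index before the bulk load (placeholder names)
    DROP INDEX idx_target_col ON target_table;

    -- ... run the bulk insert / stored procedure here ...

    -- Recreate the index once the load has finished
    CREATE INDEX idx_target_col ON target_table (target_col);
    ```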

    If you can provide more details about how much data you are dealing with and what kind of manipulations you are doing, we may be able to suggest a better approach.

    Regards,
    Ashish.

  • Hello All,

    Thank you for all your responses.

    aloks176 / coltonb - I tried setting the timeout but it still fails, because my SP's execution time varies from run to run (roughly one to two hours), so I can't set a fixed limit. Moreover, I can't improve the performance any further; it is already optimized.

    ashishd423 - My target table doesn't contain any indexes apart from the primary key (which is the default index). I am dealing with a minimum of 1 lakh (100k) rows to store in the target table, and I am not doing any complex operations: I am joining four tables and inserting into a new table.

    And I have a few queries about SPs called from Appian:
    1) How long can an SP run? Is there a time limit, e.g. it runs for some time (say 20 minutes) and is skipped beyond that?
    2) Is there any chance the SP call is asynchronous? I mean, just triggering the SP and then skipping the node.
    3) Will the SP skip execution when SQL Server memory is full?

    Thanks in advance
    It may well be a maximum-execution-time issue, which raising the timeout alone did not solve. If your SP's runtime is this variable, I believe rethinking the design would be the better principle.
    Can you elaborate on what exactly you are doing or trying to achieve?
    How often are you running this model?
    Is the stored proc giving any output back to Appian?
    I hope a solution outside of Appian is possible here, so the load on Appian could be removed.
    1) How long can an SP run? Is there a time limit, e.g. it runs for some time (say 20 minutes) and is skipped beyond that?

    - Certainly, if the SP takes too long, the Stored Procedure node breaks and generates an alert that it was not able to obtain the resources required to perform the operation. But the stored procedure still runs in the backend. You can check whether it is still running by listing the active processes, i.e. SHOW PROCESSLIST; the Info column of the result will show whether your SP is still running.

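    A sketch of checking for the running procedure (assuming MySQL; while the procedure runs, its CALL statement appears in the Info column):

    ```sql
    -- List active sessions; the Info column shows the statement being executed
    SHOW FULL PROCESSLIST;

    -- Or filter via information_schema
    SELECT id, user, time, state, info
    FROM information_schema.processlist
    WHERE info LIKE 'CALL%';
    ```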

    2) Is there any chance the SP call is asynchronous? I mean, just triggering the SP and then skipping the node.

    - I don't think we can call an SP asynchronously.

    3) Will the SP skip execution when SQL Server memory is full?

    - This is a rare case, but high SQL memory usage certainly affects the execution of an SP.

    You can try the approach below.

    As you said, you are joining four tables and inserting into a new table. Can you create a view that combines the data from the four tables and use that view in your stored procedure? Also, if the stored procedure takes a long time to execute, another option is to run it in batches, e.g. 50k rows at a time across multiple batches. The batch size can vary depending on how long the SP takes to execute.
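    A minimal sketch of the view-plus-batching idea (all table, column, and view names here are placeholders, and @last_id tracks the previous batch's upper bound):

    ```sql
    -- Combine the four source tables once, in a view
    CREATE VIEW v_combined AS
    SELECT a.id, a.col1, b.col2, c.col3, d.col4
    FROM table_a a
    JOIN table_b b ON b.a_id = a.id
    JOIN table_c c ON c.a_id = a.id
    JOIN table_d d ON d.a_id = a.id;

    -- Each call inserts one batch of 50k rows, keyed on the id range,
    -- so a single execution stays well under the node timeout
    INSERT INTO target_table (id, col1, col2, col3, col4)
    SELECT id, col1, col2, col3, col4
    FROM v_combined
    WHERE id > @last_id
    ORDER BY id
    LIMIT 50000;
    ```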

    Let me know if this helps.

    ~ Ashish

  • +1
    Certified Lead Developer
    in reply to melwyna
    1) The SP is limited by the Smart Service execution timeout; IIRC this is 1 hour. This is different from the typical DB timeouts when using Query Rules/Entities, which are controlled by server configs: docs.appian.com/.../Post-Install_Configurations.html. I am unsure whether you can increase the timeout for a Smart Service execution, but if you are on Cloud, you will likely need to raise a support ticket for this.

    2) You could run it asynchronously, but I do not believe this will help your timeout problem. To achieve this, I believe you should wrap your Execute SP node in a sub-process and run that sub-process asynchronously. I would suggest adding error handling in the sub-process: the SP can return an OUT parameter isSuccess, which returns false if the transaction had to be rolled back.
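    A sketch of the isSuccess OUT parameter idea (assuming MySQL syntax; the procedure, table, and column names are placeholders):

    ```sql
    DELIMITER //
    CREATE PROCEDURE load_target(OUT isSuccess TINYINT)
    BEGIN
        -- Any SQL error rolls back the work and reports failure
        DECLARE EXIT HANDLER FOR SQLEXCEPTION
        BEGIN
            ROLLBACK;
            SET isSuccess = 0;
        END;

        START TRANSACTION;
        INSERT INTO target_table (id, col1)
        SELECT id, col1 FROM v_combined;
        COMMIT;
        SET isSuccess = 1;
    END //
    DELIMITER ;
    ```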

    3) I believe this is likely.

    I very much agree with ashishd423: see if you can use a view and batching to complete your SP calls within an hour. I would also suggest running these jobs only outside business hours, when the DB should have more resources.

    Additionally, as ashishd423 points out, your SP keeps running in the DB. The smart service simply errors once it has waited an hour and stops listening for a response, but it does not send a kill signal. You could use this to your advantage: in parallel, run a query in a loop with a buffer timer (so as not to hit the DB too often) that looks for a record inserted at the end of your SP, so you know the stored procedure completed and your data is there (if you need it downstream). Again, consider error handling: if you expect the SP should never take more than 2 hours, send an email to your support team once loop counter * buffer time > 2 hours.
  • Hello All,

    Thanks for all your responses/suggestions.

    As ashishd423 and jamesm881 mentioned, I am facing this issue due to a timeout, since the SP's execution crosses an hour. Moreover, I am using SQL Server; is there any way to look into the active processes?

    And is there any way to increase the node timeout to 2 hours on an on-premise server (just as an alternative option)?

    Thanks in advance
  • 0
    Certified Lead Developer
    in reply to melwyna
    I am unsure where you could increase the timeout for Smart Service nodes, though I believe it is possible. I'll ask around, and if anyone knows, I'll direct them here to update.

    I believe you can refer to the page below for how to view current job activity in SQL Server.
    docs.microsoft.com/.../view-job-activity
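    Since you are on SQL Server, a sketch of checking active requests directly (sys.dm_exec_requests and sp_who2 are standard; the filter is illustrative):

    ```sql
    -- Show currently executing requests and their elapsed time (T-SQL)
    SELECT r.session_id,
           r.status,
           r.command,
           r.total_elapsed_time / 1000 AS elapsed_seconds,
           t.text AS sql_text
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE r.session_id <> @@SPID;

    -- Or the classic overview procedure
    EXEC sp_who2;
    ```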