Best Practices for bulk record updates?

What is the best practice for making bulk record updates? For example, you have a table for a particular CDT with, say, 10,000 rows and 20 fields. We need to be able to filter the rows and update values in that subset of rows. Now add the fact that we need to do this for multiple tables across multiple data stores (to minimize the dev effort required to cover all the necessary tables). We cannot simply run a manual SQL script in our production environment to update records without jumping through approval hoops and involving many people (it is against policy). My company wants a unified app that can make bulk updates to tables in any of our Appian data stores via dynamically constructed SQL. Should we:

- Create an app to inject parameterized SQL into any of our databases (hopefully not, but this is the request).

- Use a third-party ETL tool for bulk changes.

- Use a smart service (if one exists) that lets you select a table, select the column to update, supply the new value for that column, and supply a filter so that only certain rows are updated (a sketch of this pattern follows the list).

- Export records to Excel, let the business make changes, and set up an Excel-import process model for any data entity.

- "You're doing it all wrong, this is how you should do this...."

What I have come up with so far is:

Most of our apps already have read-only grids where you can filter/view the important CDTs. I could modify those existing grids as follows: add a second row of dropdowns (by duplicating the existing dropdown filters), but instead of filtering, this second set of dropdowns would set the value of a given field on all of the filtered rows.

Sending the updated rows to a Write to Data Store Entity process model would then be easy. What do you all think?
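
To make the second-dropdown idea concrete, here is a rough, hypothetical fragment of one "set value" dropdown. It assumes the grid is backed by local!filteredRows and that the target column is a text field named status (both placeholder names); picking a value stamps it onto every row currently in the filtered set:

```
a!dropdownField(
  label: "Set Status on all filtered rows",
  placeholder: "--- Select a value ---",
  choiceLabels: {"Open", "Closed"},
  choiceValues: {"Open", "Closed"},
  value: local!bulkStatus,
  saveInto: {
    local!bulkStatus,
    /* Apply the selected value to every row currently in the filtered set. */
    a!save(
      local!filteredRows,
      a!forEach(
        items: local!filteredRows,
        expression: a!update(data: fv!item, index: "status", value: save!value)
      )
    )
  }
)
```

A submit button could then pass local!filteredRows to the write-to-data-store process model (or call a!writeToDataStoreEntity directly in its saveInto), so the only per-table work would be duplicating the dropdown and pointing it at the right field.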

Thanks!
