What is the best practice for deploying tables during an application package import to a new environment?

Hi All,

I wanted to get your views on the best way to deploy the tables associated with an application package in Appian when we move the package from a source environment to a new target environment. As of now, the practice is that the administrators ask the developers to provide the application package along with the associated SQL scripts for the application. The administrators first import the package on the target environment and then manually run the scripts in the cloud database's SQL view.

Another school of thought is that the administrators should import the package, then go into Appian Designer, navigate to the newly deployed package's data store, and under Schema Management click the “Verify” button to check whether the tables exist and, if they don't, have Appian create them so the data store entities map to tables on the cloud database.

Which of these two approaches is the ideal one?

Thanks & Regards,

Sam

  • Certified Senior Developer

    Hi Sam, 

    In general, if you have the SQL scripts for the table/view definitions, the best approach would be to run those scripts first in the target environment and then import the application package containing the CDTs and the data store. That way the package will be imported without any warnings, and you don't need to verify and publish the data store once it is deployed.
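
    For illustration only, a minimal sketch of the kind of DDL script that might be handed over with the package, assuming a MariaDB cloud database and made-up table/column names:

        -- Hypothetical DDL supplied alongside the package; names are
        -- illustrative and should match the fields of the corresponding CDT.
        CREATE TABLE CUSTOMER_ORDER (
            ID         INT         NOT NULL AUTO_INCREMENT,  -- maps to the CDT's primary key field
            ORDER_REF  VARCHAR(50) NOT NULL,
            ORDER_DATE DATETIME,
            PRIMARY KEY (ID)
        );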

    Hope this answers your question.

    Cheers!

  • Hi Naresh, thanks for your response.

    1. With your approach we are assuming that the SQL scripts will always be correct; if a wrong script is run, the only way we will find out is when the application throws an error at run time.

    2. Also, when we deploy a package on a new environment, from what I have seen so far we never get a warning about tables not existing in the database.

  • Hi Naresh,

    1. If you just verify the data store, you will get to know whether all the tables exist or not.

    2. While deploying the application, Appian won't check whether the database schema matches the CDTs; it only checks that all the objects and their dependencies are present. It is when you publish the data store that you will get a warning message.

    Thanks,

    Yash

  • Agreed, Yash. So the best way would be, post-import, to go to the data store and Verify/Publish & Save to ensure the tables are created correctly, and Appian will refresh the table schema as and when changes are made to the CDTs.

  • Certified Senior Developer
    in reply to Sam

    Hi Sam,

    1. Yes, the script has to be correct, and to make sure of that you can take the DDL scripts from the Dev environment for each data store using the Download DDL Script link available there. The approach I mentioned also covers database views, whose definitions should be created through SQL scripts in the target environment before publishing the data store (a sample view definition is sketched after this list).

    2. I'm not sure if you have ever come across the warnings shown in the snapshot below, which appear when you deploy a package containing a data store while the table definitions are still missing in the target environment.
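
    To illustrate point 1, here is a hedged example of the kind of view definition such a DDL script might include (the view, table, and column names are hypothetical):

        -- Hypothetical view included in the DDL script so that the data store
        -- entity reading from it can be verified and published in the target
        -- environment (names are made up).
        CREATE OR REPLACE VIEW V_RECENT_ORDERS AS
        SELECT ID,
               ORDER_REF,
               ORDER_DATE
          FROM CUSTOMER_ORDER
         WHERE ORDER_DATE >= CURDATE() - INTERVAL 30 DAY;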

    Cheers!

  • Agreed, Naresh. Quite honestly I haven't got a warning on the DSE import so far. I will look out for it the next time I do an import. Thanks for the screenshot; it will be helpful for comparison.

    But net-net, there is no explicitly stated need to deploy the DB scripts manually on the cloud database.

  • With your approach we are assuming that the SQL scripts will always be correct

    Developers should write the SQL scripts first and use those to do all table creations, modifications, etc. in the Dev environment. Subsequent modifications (before deployment) should be made to those source SQL scripts in a way that accounts for the scripts needing to be run fresh in a new environment. In this way, the chance of serious mismatches between the SQL scripts and the data types is reduced to approaching zero. I've been through entire epics of sprints with large deployments on a 5-year-old production application with no serious DB mismatch issues when promoting to any of our 4 environments following this practice.
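
    A hedged sketch of what that could look like in practice, assuming a MariaDB cloud database and hypothetical object names; the point is only that the single source script stays replayable from scratch in any environment:

        -- Hypothetical source script maintained so it can always be run
        -- end to end in a fresh environment (names are illustrative only).
        CREATE TABLE IF NOT EXISTS CUSTOMER_ORDER (
            ID         INT         NOT NULL AUTO_INCREMENT,
            ORDER_REF  VARCHAR(50) NOT NULL,
            ORDER_DATE DATETIME,
            PRIMARY KEY (ID)
        );

        -- A column added mid-sprint is folded into the same script in a
        -- re-runnable form (ADD COLUMN IF NOT EXISTS is MariaDB syntax)
        -- rather than run once by hand in Dev and then forgotten.
        ALTER TABLE CUSTOMER_ORDER
            ADD COLUMN IF NOT EXISTS TOTAL_VALUE DECIMAL(10,2);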

  • Yes, Mike. That is the best way to poka-yoke the process. However, if we use a DSE to make Appian deploy the tables on the cloud DB, the possibility of any mismatch is completely eliminated, as the entire table-creation activity is handled by Appian based on the most current CDTs available in the application package. This becomes a much safer approach. What do you think?