Hi,
Our client has an Appian process (triggered by a daily batch job) that exports all the data from around fifty tables. The plugin used for this is "MySQL Dump To Document". Since we upgraded to Appian 25.3, the plugin is deprecated and the process now fails with this error:

Error: "Error creating MySQL Dump to Appian Document"

What alternative plugin could you advise me to install? Otherwise, which native Appian solution could you propose? Our client needs to deploy a solution to Production in a few days, so I have to find one quickly.
Regards
Here is a partial example of a SQL script generated by the plugin (AS_DUMP 9_2_2025 4_00 AM GMT+00_00.sql):
CREATE DATABASE IF NOT EXISTS Appian;
USE Appian;

-- Table structure for: TT_MY_TABLE1
DROP TABLE IF EXISTS `TT_MY_TABLE1`;
CREATE TABLE `TT_MY_TABLE1` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `fk_case_id` int(11) DEFAULT NULL,
  `comments` varchar(1500) DEFAULT NULL,
  ...
  PRIMARY KEY (`id`),
  KEY ...
) ENGINE=InnoDB AUTO_INCREMENT=2610 DEFAULT CHARSET=utf8mb3 COLLATE=utf8mb3_general_ci;

-- Dumping data for: TT_MY_TABLE1
LOCK TABLES `TT_MY_TABLE1` WRITE;
INSERT INTO TT_MY_TABLE1 VALUES ('1','2',NULL,NULL,...),('2','5',NULL,NULL,...),('3','6',NULL,NULL,...),...;
UNLOCK TABLES;

-- Table structure for: TT_MY_TABLE2
DROP TABLE IF EXISTS `TT_MY_TABLE2`;
CREATE TABLE `TT_MY_TABLE2` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `fk_case_id` int(11) DEFAULT NULL,
  `val` varchar(1500) DEFAULT NULL,
  ...
  PRIMARY KEY (`id`),
  KEY ...
) ENGINE=InnoDB AUTO_INCREMENT=2610 DEFAULT CHARSET=utf8mb3 COLLATE=utf8mb3_general_ci;

-- Dumping data for: TT_MY_TABLE2
LOCK TABLES `TT_MY_TABLE2` WRITE;
INSERT INTO TT_MY_TABLE2 VALUES ('1','2',NULL,NULL,...),('2','5',NULL,NULL,...),('3','6',NULL,NULL,...),...;
UNLOCK TABLES;
For what purpose?
They use it for internal BI analysis. They were supposed to deploy the Appian application next Monday, but because of the 25.3 release they can no longer do so.
OK. And what about a CSV export? Or Excel files? We have plugins for this.
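To illustrate the CSV route, here is a minimal sketch. It assumes the rows have already been fetched from MySQL with any client; the column names and row values are made up for the example.

```python
# Sketch: turn fetched rows into CSV text, one file per table.
# Row fetching is omitted; any MySQL client can supply `columns` and `rows`.
import csv
import io

def rows_to_csv(columns, rows):
    """Render a header row plus data rows as CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(columns)   # header line
    writer.writerows(rows)     # one line per record
    return buf.getvalue()

# Illustrative data, mirroring the dump's first table.
print(rows_to_csv(["id", "fk_case_id"], [(1, 2), (2, 5)]))
```

Each table would be written to its own file, which BI tools can usually load directly.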
If the data is to be used in another system for analytics or reporting, the best approach is to set up an Enhanced Data Pipeline (EDP) between Appian and that system. Here is the documentation to get started: docs.appian.com/.../Enhanced_Data_Pipeline_for_Appian_Cloud.html
I'm going to put the question to the client...
Thank you, that's interesting. Is it easy to set up?
To an extent, yes: there is little to code or configure, because the Appian Support team sets up the Enhanced Data Pipeline (EDP). There are some prerequisites, listed in the documentation, that must be in place before the setup can be done. Go through the documentation, then raise a Support ticket with Appian to get the conversation and setup going if it suits. The aforementioned link also contains the steps for raising the Support ticket.
I've asked the client: they LOAD the data into another database. If they cannot use CSV, is there any way to easily generate SQL INSERT statements? (I'm keeping Harsha's solution too, so I can present all the available options to the client.)
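If a small script outside Appian is acceptable, generating INSERT statements from query results is straightforward. A minimal sketch, assuming the rows are fetched with any MySQL client (that part is omitted) and using illustrative table and column names:

```python
# Sketch: build multi-row MySQL INSERT statements in the same shape
# as the deprecated plugin's dump. Row fetching is omitted.

def sql_literal(value):
    """Render a Python value as a MySQL literal."""
    if value is None:
        return "NULL"
    if isinstance(value, (int, float)):
        return str(value)
    # Escape backslashes and single quotes inside string values.
    escaped = str(value).replace("\\", "\\\\").replace("'", "''")
    return f"'{escaped}'"

def insert_statement(table, columns, rows):
    """Build one multi-row INSERT for a table."""
    cols = ", ".join(f"`{c}`" for c in columns)
    tuples = ",".join(
        "(" + ",".join(sql_literal(v) for v in row) + ")" for row in rows
    )
    return f"INSERT INTO `{table}` ({cols}) VALUES {tuples};"

# Illustrative data, mirroring the dump's first table.
rows = [(1, 2, None), (2, 5, "a 'quoted' comment")]
print(insert_statement("TT_MY_TABLE1", ["id", "fk_case_id", "comments"], rows))
```

For very large tables you would batch the rows (say, a few thousand per INSERT) rather than emit one giant statement.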