How are you getting your Appian MySQL data (SaaS environment) to your internal data warehouse?

We need to move the data captured in Appian to our internal data warehouse for reporting, analytics, and dashboards presented across platforms. Our process data is captured in MySQL for performance reasons, and we do not want to write to the warehouse database as part of the process itself. Instead, we would like to get the data to our internal data warehouse on a nightly basis, at least to start. How are you doing this in your SaaS environment today? Are you bulk loading? Are you writing a process to do it? Are you creating files and sending them via SFTP? Are you using web services for the data warehouse to pull? Any other creative solutions? Tricks, tips, and other info would be much appreciated.
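For context on the file-and-SFTP option asked about above, here is a minimal sketch of a nightly job, assuming the script can reach the business-data schema (directly, via a replica, or via an exported copy) and that the warehouse exposes an SFTP drop folder. The hostnames, credentials, table, and column names are placeholders, not actual Appian or warehouse objects.

```python
# Nightly extract: query the business tables, dump to CSV, push via SFTP.
# All hostnames, credentials, and table/column names below are placeholders.
import csv
import datetime

import pymysql    # pip install pymysql
import paramiko   # pip install paramiko

EXTRACT_SQL = "SELECT * FROM case_data WHERE modified_on >= %s"  # hypothetical table


def export_to_csv(since: datetime.datetime, path: str) -> None:
    # Assumes a read-only account on the MySQL schema holding the process data.
    conn = pymysql.connect(host="mysql-host.example.com",
                           user="report_reader", password="***",
                           database="appian_business")
    try:
        with conn.cursor() as cur:
            cur.execute(EXTRACT_SQL, (since,))
            with open(path, "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow([col[0] for col in cur.description])  # header row
                writer.writerows(cur.fetchall())
    finally:
        conn.close()


def push_via_sftp(path: str, remote_path: str) -> None:
    # Assumes the warehouse team provides an SFTP landing folder for inbound files.
    transport = paramiko.Transport(("warehouse-sftp.example.com", 22))
    transport.connect(username="appian_feed", password="***")
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        sftp.put(path, remote_path)
    finally:
        sftp.close()
        transport.close()


if __name__ == "__main__":
    yesterday = datetime.datetime.now() - datetime.timedelta(days=1)
    local_file = "case_data_nightly.csv"
    export_to_csv(yesterday, local_file)
    push_via_sftp(local_file, "/inbound/case_data_nightly.csv")
```

Scheduling the script from cron (or an equivalent job scheduler) once a night would cover the "at least to start" cadence described above.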

OriginalPostID-262657


  • If you are on one of our recent versions, have you considered Web APIs? They might give you an easier way of doing the export. That said, if you are doing huge-volume data dumps, then I suggest you export your MySQL data to a file directly from the Appian MySQL admin console and let the warehouse consume it on a periodic basis. If you want real time, then you would need to push the data out every time you commit it to the database, but that is not the best approach for a data-warehouse-based solution.
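A minimal sketch of the Web API option from the warehouse side: a scheduled job calls an Appian Web API and lands the result as a CSV for bulk loading. The endpoint name, query parameters, and field names below are placeholders for whatever the Web API object is built to return, and the sketch assumes API-key authentication with a service account.

```python
# Warehouse-side pull: call an Appian Web API nightly and land the rows as CSV
# for a bulk load. Endpoint, parameters, and field names are placeholders.
import csv

import requests  # pip install requests

BASE_URL = "https://yoursite.appiancloud.com/suite/webapi/case-data-export"  # hypothetical endpoint
API_KEY = "***"  # API key issued to an Appian service account


def pull_batch(start_index: int, batch_size: int = 1000) -> list:
    resp = requests.get(
        BASE_URL,
        headers={"Appian-API-Key": API_KEY},
        params={"startIndex": start_index, "batchSize": batch_size},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()  # assumes the Web API returns a JSON array of records


def pull_all_to_csv(path: str) -> None:
    start, batch = 1, 1000
    with open(path, "w", newline="") as f:
        writer = None
        while True:
            rows = pull_batch(start, batch)
            if not rows:
                break
            if writer is None:
                writer = csv.DictWriter(f, fieldnames=rows[0].keys())
                writer.writeheader()
            writer.writerows(rows)
            start += batch


if __name__ == "__main__":
    pull_all_to_csv("case_data_nightly.csv")
```

The same script could write straight into the warehouse's staging tables instead of a CSV; the file is simply the easiest hand-off for a nightly bulk load.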