Hi All,
I wanted to try an approach to transfer data from a table in my Oracle DB to a table in a SQL Server DB.
I tried using two Query Database smart services: one to select the values from the Oracle DB table (Query Database 1), and the other to insert the values into the SQL Server DB table (Query Database 2).
The Oracle DB table contains the values below:
SQL statement used in the Query Database smart service: (SELECT ID, FIRSTNAME, LASTNAME, TITLE FROM EMPLOYEE)
After executing the process model to load the same data into the SQL Server table, the SQL Server DB table becomes:
SQL statement used in the Query Database smart service: (INSERT INTO EMP VALUES (ac!ID, ac!FIRSTNAME, ac!LASTNAME, ac!TITLE))
The values are populating in a single row, not in separate rows as in the Oracle DB.
Can anyone help me with this?
I would suggest storing the data returned from Oracle in a pv, then formatting it into a CDT depending on the format in which Oracle returns the data. Then write the data from this CDT to the SQL DB. In other words, you need to insert the data row-wise rather than column-wise.
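To illustrate the row-wise idea outside of Appian: the sketch below uses Python with sqlite3 as a stand-in for both databases (the real setup would use Oracle and SQL Server drivers, and in Appian the loop would be a CDT written row by row or a node run with multiple instances). Table and column names follow the thread; the sample employee rows are invented for the demo.

```python
import sqlite3

# Source DB: stands in for the Oracle EMPLOYEE table.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE EMPLOYEE (ID INTEGER, FIRSTNAME TEXT, LASTNAME TEXT, TITLE TEXT)")
src.executemany(
    "INSERT INTO EMPLOYEE VALUES (?, ?, ?, ?)",
    [(1, "John", "Doe", "Analyst"), (2, "Jane", "Roe", "Manager")],  # sample data, invented
)

# Target DB: stands in for the SQL Server EMP table.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE EMP (ID INTEGER, FIRSTNAME TEXT, LASTNAME TEXT, TITLE TEXT)")

# Row-wise transfer: one INSERT per source row, instead of one INSERT
# that crams every selected value into a single row.
rows = src.execute("SELECT ID, FIRSTNAME, LASTNAME, TITLE FROM EMPLOYEE").fetchall()
for row in rows:
    dst.execute("INSERT INTO EMP VALUES (?, ?, ?, ?)", row)
dst.commit()
```

After the loop, the target table has one row per source row, which is the behavior the original insert was missing.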
There is a similar recent thread: https://community.appian.com/discussions/f/process/14765/process-variable-data-to-sql-database-table
I would check that your variables (pvs and acs) are configured as multiples. As others stated, CDTs might also be a good route.
Like I mentioned on the linked thread: I am curious about the volume of data you are dealing with and the frequency at which you want to perform this transfer. How much data are you talking about? And how often will your application be doing this operation? At large scale, alternative methods would be preferred over doing this transfer in memory (via process variables).
https://community.appian.com/w/the-appian-playbook/188/transferring-processing-large-data-sets-etl