Automation of Bulk Upload of data
rishub
over 7 years ago
Hi All,
We need to upload a text file into the Appian DB every night; it contains almost 60,000 rows. I was thinking of using a stored procedure, but the file will be on the application server and the DB is on another server. Can anyone suggest a way to do this without creating an Appian process? I don't want to parse such a large file inside our system.
Thanks in Advance
Rishu
OriginalPostID-241889
rishub
over 7 years ago
Also, we can't access the servers, as they are hosted by a third party.
ryanh
over 7 years ago
Most databases have a way to bulk load/transform data, as well as to schedule jobs. You could have the database run a job that checks a set directory for a certain file name, then parses the file and loads the rows in bulk using the database's native ETL/bulk-loading features. Appian would only be used to let the user upload the file and store a copy of it in a DB-accessible path on the server.
FYI, we have some apps that load CSV files nightly, and that works fine for <100 rows of data, but one application loads hundreds of rows every 1-2 hours throughout the day. For each row we kick off a new process. Needless to say, starting a few hundred processes in quick succession, with each process receiving a CDT of 20+ columns, running a few DB queries, updating the CDT, and then writing it to the data store, is NOT efficient. If we use subprocesses/MNI in sets of, say, 50, we get performance problems because we overload the engine (a parent process spawns its subprocesses on the same engine). I am converting some of this to start processes via messages, so that they are spread across all of our execution engines/servers; hoping this improves stability and reduces user-interface issues for users on the same engine while data is loading (the code change goes live tonight).
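The batching idea above can be sketched outside Appian as well: read the file once and insert rows in large batches through the DB driver, rather than one process (or one statement) per row. This is a minimal sketch, not the poster's actual job; the `staging` table, column names, and file layout are illustrative assumptions, and `sqlite3` stands in for whatever driver your database uses.

```python
# Hedged sketch: bulk-load a delimited file into a database in batches
# instead of one insert (or one Appian process) per row.
# Table/column names and the file layout are illustrative assumptions.
import csv
import sqlite3  # stands in for the real DB driver (e.g. MySQL/MariaDB)

def bulk_load(conn, path, batch_size=1000):
    """Read rows from a CSV file and insert them in batches. Returns row count."""
    cur = conn.cursor()
    total = 0
    batch = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            batch.append(row)
            if len(batch) >= batch_size:
                # one round trip per batch, not per row
                cur.executemany(
                    "INSERT INTO staging (col_a, col_b) VALUES (?, ?)", batch)
                total += len(batch)
                batch = []
    if batch:  # flush the remainder
        cur.executemany(
            "INSERT INTO staging (col_a, col_b) VALUES (?, ?)", batch)
        total += len(batch)
    conn.commit()
    return total
```

For 60,000 rows a batch size in the low thousands usually keeps memory flat while cutting round trips by orders of magnitude; the exact sweet spot depends on the driver and network.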
rishub
over 7 years ago
Thanks for the update, ryanh. The issue I'm facing with a database job is that the file will land on a different server than the one where the DB is. These servers are hosted by a third party and we don't have access to them, so I'm not sure how the stored procedure would get the file from the remote server.
JaiV
over 7 years ago
If you can convert your data file to SQL INSERT statements, the cleanest way to upload large data sets is the Batch SQL Import plugin.
forum.appian.com/.../summary
rishub
over 7 years ago
Hey JaiV, thanks for the response. I will be receiving a plain text file; how can I convert it to SQL INSERT statements?
sagarl511
over 7 years ago
@rishub - Adding to @ryanh's comments: we did this similarly, except we used a stored procedure that takes XML as input (we converted the parsed CSV data to XML). This let us delegate part of the work to the stored procedure rather than relying on Appian's OOTB Write to Data Store Entity. Another approach is to write a smart service that takes a document ID as a parameter, parses the document, and stores the data in the DB.
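The CSV-to-XML conversion described above can be sketched like this; the element names (`rows`/`row`) and column list are assumptions for illustration, not the poster's actual schema, and the resulting document would be handed to the stored procedure as a single parameter.

```python
# Hedged sketch: wrap parsed CSV rows in one XML document so a single
# stored-procedure call can process the whole batch.
# Element and column names are illustrative assumptions.
import csv
import io
import xml.etree.ElementTree as ET

def rows_to_xml(csv_text, columns):
    """Convert CSV text to an XML string with one <row> element per record."""
    root = ET.Element("rows")
    for row in csv.reader(io.StringIO(csv_text)):
        el = ET.SubElement(root, "row")
        for name, value in zip(columns, row):
            ET.SubElement(el, name).text = value
    return ET.tostring(root, encoding="unicode")
```

Passing one XML blob instead of 60,000 individual writes keeps the Appian side to a single call and lets the database do the row-by-row work.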