Bulk Upload of Data into Appian DB (Dealing with Lakhs of Records)
harshav
over 8 years ago
Hi All,
I will be getting data from an external system in an Excel sheet, and I need to insert all of that data into the Appian DB.
OriginalPostID-231862
Top Replies
brettf
over 8 years ago
To tag onto colton's post above, I would recommend using an external tool to generate the SQL insert statements. The website sqlizer.io can take in an Excel file and generate the code, as long as the Excel…
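For reference, a minimal Python sketch of that idea (the file name, table name, and columns below are made up for illustration; sqlizer.io or a similar tool does the same thing for you):

import pandas as pd  # reading .xlsx also requires openpyxl

# Hypothetical input file and target table -- adjust to your schema.
df = pd.read_excel("external_system_export.xlsx")
table = "customer_data"
columns = ", ".join(df.columns)

with open("inserts.sql", "w") as out:
    for _, row in df.iterrows():
        # Quote every value as a string literal and escape single quotes;
        # real tooling should handle NULLs, dates, and numerics properly.
        values = ", ".join("'" + str(v).replace("'", "''") + "'" for v in row)
        out.write(f"INSERT INTO {table} ({columns}) VALUES ({values});\n")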
harshav
over 8 years ago
Is there an easy way to insert the data without affecting the performance of the server? I know this would be done in batches and during off hours; I just wanted to check whether there are any smart services or other easy ways to do this.
hiteshd
Certified Lead Developer
over 6 years ago
in reply to
harshav
Hi,
You can also look at the 'Import CSV to Database' smart service if you are fine with using a CSV file. It is pretty fast, but it has its own limitations, so it ultimately depends on your use case.
Thanks.
Hitesh
manishk0001
A Score Level 2
over 6 years ago
in reply to
hiteshd
Hi,
Loading more than 5,000 records at once is always a challenge and will affect performance. The points below could be useful:
1. Consider Import CSV to Database, as it does not require any data in a CDT.
2. Do not do any data manipulation in Appian while loading the data from the CSV file.
3. Do not insert more than 5,000 records in a single operation, or it will show up in the Health Check report as high impact.
4. Consider uploading the data in batches if you have larger volumes, such as millions of records (see the sketch after this list).
5. Consider loading large volumes using the database's own import/export features.
6. Consider loading the data during off hours.
7. Use an external tool to load the data.
8. If you have an RPA tool, use it to load the data.
9. Consider scheduling such a large upload weekly or monthly.
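For point 4, a rough Python sketch of what batched inserts can look like (the file name, table, columns, and connection details are placeholders, not anything Appian-specific; any MySQL driver with an executemany() call works similarly):

import csv
import mysql.connector  # hypothetical connection details below

BATCH_SIZE = 5000  # stay at or below the 5,000-row guideline above

conn = mysql.connector.connect(
    host="db-host", user="app_user", password="***", database="appian_data"
)
cur = conn.cursor()
sql = "INSERT INTO customer_data (id, name, amount) VALUES (%s, %s, %s)"

with open("export.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    batch = []
    for row in reader:
        batch.append(row)  # values are passed as strings here; cast as needed
        if len(batch) == BATCH_SIZE:
            cur.executemany(sql, batch)  # one round trip per batch
            conn.commit()
            batch = []
    if batch:  # flush the final partial batch
        cur.executemany(sql, batch)
        conn.commit()

cur.close()
conn.close()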
Please let us know which approach you take. I have faced the same scenario (40K rows), and it always comes up as high risk in Health Check, so we use the RPA tool Blue Prism: we send a mail with the CSV attachment to Blue Prism, which downloads the file and loads the data into the DB.
Thanks
Manish
rp_balaji
A Score Level 1
over 6 years ago
in reply to
hiteshd
Hi Hitesh,
Thanks for the response. Can the smart service be used for uploading 10K records, or does it have limitations on the number of records that can be uploaded in a single stretch?
Regards,
Balaji.R
hiteshd
Certified Lead Developer
over 6 years ago
in reply to
rp_balaji
Hi,
10K should not be a problem. I have tried 40K rows of data with 10 columns each in one shot without any issues against Appian's MySQL on Cloud.
You will have to give it a try in your environment to see how much time it takes.
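A tiny timing wrapper like the sketch below makes it easy to compare runs in your own environment (load_batches and the row count are just stand-ins for whatever load routine and data volume you end up testing):

import time

ROW_COUNT = 40_000  # adjust to the size of your test file

def load_batches():
    # Stand-in for your actual load (smart service run, batched inserts, etc.)
    ...

start = time.perf_counter()
load_batches()
elapsed = time.perf_counter() - start
print(f"Loaded {ROW_COUNT} rows in {elapsed:.1f} s ({ROW_COUNT / elapsed:.0f} rows/s)")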
Thanks.
Hitesh
rp_balaji
A Score Level 1
over 6 years ago
in reply to
hiteshd
Hi hiteshd,
Thanks, I'll give it a try.
Regards,
Balaji.R