Bulk Upload of Data into Appian DB (Dealing with Lakhs of Records)

Hi All,

I will be getting data from an external system in an Excel sheet, and I need to insert all of the data into the Appian DB.

OriginalPostID-231862


  • 0
    A Score Level 2
    in reply to hiteshd
    Hi,

    Loading more than 5,000 records at a time is always a challenge and will affect performance. The points below could be useful:

    1. Consider "Import CSV to Database", as it does not require any data in a CDT.
    2. Do not do any data manipulation in Appian while loading the data from the CSV file.
    3. Do not insert more than 5,000 records in a single write, or it will show up in the Health Check report as high impact.
    4. Consider uploading the data in batches if you have a larger volume, such as millions of records.
    5. Consider loading large volumes using the database's own import/export features.
    6. Consider loading the data during off hours.
    7. Use an external tool to load the data.
    8. If you have an RPA tool, use it to load the data.
    9. Consider scheduling such a large upload monthly/weekly.
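The batching advice above (keep each write under 5,000 records) can be sketched in a minimal, non-Appian illustration: split the CSV into chunks of at most 5,000 rows and hand each chunk to its own write. The file path and column layout here are hypothetical.

```python
import csv
from itertools import islice

BATCH_SIZE = 5000  # keep each write below the Health Check impact threshold

def read_in_batches(csv_path, batch_size=BATCH_SIZE):
    """Yield lists of CSV rows, each list holding at most batch_size rows."""
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        while True:
            batch = list(islice(reader, batch_size))
            if not batch:
                break
            yield batch
```

Each yielded batch can then be written in its own transaction, so no single write exceeds the 5,000-record guideline.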

    Please let us know which approach you take. I have faced the same scenario (40K rows), and it always comes up as high risk in Health Check. So we use the RPA tool Blue Prism: we send a mail with the CSV attachment to Blue Prism, and it downloads the file and loads the data into the DB.

    Thanks

    Manish
  • 0
    Certified Lead Developer
    Hi, there are 2 use cases here:
    1. When you want to deploy these scripts manually into the DB: I would go for generating SQL scripts online via existing service providers, as mentioned by @brettf, because this will be much quicker than doing it via Appian or other tools. But it has a limitation, i.e. only a user who is aware of these steps will be able to dump the data into the DB (if it's a continuous deployment, such as weekly, monthly, etc.), which means a business user won't be able to do this unless we (the developers) train them.

    2. When you want to deploy these scripts via an interface: in this case you have various options available, where you browse the file and upload the rows into the DB in the following ways:
    -- Using the OOTB smart services
    -- Using a custom smart service, such as Excel Tools
    -- Using a robotics tool such as Blue Prism

    Limitation: if we use smart services, we must not perform this operation during business hours, and we should archive the process immediately after successful completion. I recommend processing it during weekends to avoid impact on the server.

    If we use Blue Prism, the upload is done asynchronously, hence we have no assurance that the data will be inserted immediately (i.e. before the next working day, so that the business can use it). Also, you or your organisation will need to bear the licensing cost of the robotics tool.

    Hope you will be able to choose the best approach for your use case.
  • 0
    A Score Level 2
    in reply to aloks0189
    Hi Alok,

    Just to correct a bit: if you are using Blue Prism, you can show the response to the business almost immediately (within a few minutes) through a web service in Appian or via a mail.

    Regards

    Manish
  • 0
    Certified Lead Developer
    in reply to manishk0001

    Hi Manish, thanks for your response. Just a bit of focus on this point, i.e. the asynchronous call. As per the documentation,

    Section: rule!BP_startProcess()

    it is clearly mentioned that a synchronous call can easily time out, because Blue Prism is expected to perform a heavy operation. Hence it is always best practice to invoke a Blue Prism flow asynchronously.

    When it comes to availability or querying of the latest data, that can only happen once the data is available in the Appian DB, irrespective of whether we make a service call. Also, even if I configure an email, it will only trigger once the dump is completed, because we invoke the Blue Prism flow asynchronously as per the best practices.
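The fire-and-forget pattern described above can be illustrated with a small, purely hypothetical sketch (this is not the Blue Prism or Appian API): the dispatcher returns a job id immediately instead of blocking until the heavy load finishes, and a separate status lookup stands in for the web-service callback or notification email.

```python
import threading
import uuid

# In-memory job registry standing in for an RPA work queue (hypothetical).
_jobs = {}

def start_load_async(load_fn):
    """Kick off a heavy load in the background and return a job id at once,
    so the caller never blocks long enough to hit a request timeout."""
    job_id = str(uuid.uuid4())
    _jobs[job_id] = "RUNNING"

    def _run():
        try:
            load_fn()
            _jobs[job_id] = "COMPLETED"
        except Exception:
            _jobs[job_id] = "FAILED"

    threading.Thread(target=_run, daemon=True).start()
    return job_id

def job_status(job_id):
    """What a status web service (or notification email) would report back."""
    return _jobs.get(job_id, "UNKNOWN")
```

The caller gets the job id back in milliseconds regardless of how long the load takes; the business sees "RUNNING" until the bot reports "COMPLETED" or "FAILED".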

  • 0
    A Score Level 2
    in reply to aloks0189
    Hi Alok,

    I agree with your point above; I just wanted to respond on the point that assurance of the data can only come the next day.
    Blue Prism can send the processing status back to Appian within a few minutes, and the user can see it in Appian. The SLA will be just a few minutes, and the user will know whether Blue Prism processed the data successfully or failed.

    Cheers

    Manish
  • 0
    A Score Level 1
    in reply to hiteshd
    Hi Hitesh,

    Thanks for the response. Can the smart service be used for uploading 10K records, or does it have limitations in terms of the number of records that can be uploaded in a single stretch?


    Regards,
    Balaji.R
  • 0
    Certified Lead Developer
    in reply to rp_balaji
    Hi,

    10K should not be a problem. I have tried 40K rows of data with 10 columns each in one shot, without any issues, on Appian Cloud MySQL.

    You will have to give it a try in your environment to see how much time it takes.

    Thanks.
    Hitesh
  • 0
    A Score Level 1
    in reply to hiteshd
    Hi hiteshd,

    Thanks, I'll give it a try.

    Regards,
    Balaji.R
  • Bulk uploading of data into Appian databases can be a complex process, especially when dealing with large amounts of records, such as hundreds of thousands or even millions. Here are some tips for a successful bulk upload:

    1. Prepare the Data: Make sure the data is in the correct format and free of errors before uploading. This may involve cleaning, transforming, and verifying the data.

    2. Choose the Right Tool: Appian provides several options for bulk data uploads, including CSV imports, the REST API, and the Appian Designer. Choose the one that best fits your requirements and technical capabilities.

    3. Performance Considerations: When uploading large amounts of data, it's important to consider the performance impact on the system. This may include optimizing the data format, batching the upload process, and increasing hardware resources if necessary.

    4. Monitoring and Error Handling: During the bulk upload process, it's important to monitor the progress and handle any errors that may occur. This may involve logging the upload process, checking for duplicates, and correcting any invalid data.

    5. Testing: Before uploading the data to the production environment, it's important to test the bulk upload process in a test environment to ensure that everything works as expected.
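As a concrete (hypothetical, non-Appian) illustration of points 1, 3, and 4 — validating rows, batching the insert, and handling duplicates and bad data — here is a sketch using Python's built-in sqlite3 in place of a real business database; the `orders` table and column names are invented for the example.

```python
import csv
import io
import sqlite3

def bulk_load(conn, csv_text, batch_size=5000):
    """Validate, de-duplicate, and batch-insert CSV rows; return stats."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)"
    )
    inserted = rejected = 0
    batch = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # Prepare the data: verify each row before it touches the table.
        try:
            record = (int(row["id"]), float(row["amount"]))
        except (KeyError, ValueError):
            rejected += 1  # error handling: count invalid rows instead of failing
            continue
        batch.append(record)
        if len(batch) >= batch_size:
            inserted += _flush(conn, batch)
            batch = []
    if batch:
        inserted += _flush(conn, batch)
    conn.commit()
    return {"inserted": inserted, "rejected": rejected}

def _flush(conn, batch):
    # INSERT OR IGNORE skips rows whose primary key already exists (duplicate check).
    before = conn.total_changes
    conn.executemany("INSERT OR IGNORE INTO orders VALUES (?, ?)", batch)
    return conn.total_changes - before
```

The returned stats give the monitoring hook from point 4: the caller can log how many rows went in and how many were rejected, and rerun the reject file after correction.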

    By following these tips, you can help ensure a successful bulk upload of data into Appian databases. Additionally, if you encounter any issues during the process, it may be helpful to seek assistance from Appian Support or an experienced Appian consultant.