Error uploading data using Import CSV to Database Smart Service

Certified Senior Developer

We are using Appian 18.1 to bulk upload CSV data to a SQL Server database. We are achieving it with the 'Import CSV to Database' smart service from the Excel Tools plugin, version 2.0.2 (recently upgraded), to import the data into the database.

One of the CSV files has 6 columns and around 30k records. There are no data transformations.

 

After around 5 minutes of upload, the Appian interface throws an exception with the error below (the data is nevertheless inserted into the database after some time):

"403 Access Denied: you dont have permission to view this page"

Has anyone faced the same issue, and is there a possible workaround?


  • Hi Dev,

    These types of errors occur when there is a problem executing the node, or when something takes the user to a page they do not have access to. If you can attach the actual logs from the background, we can figure out what went wrong.

    Thanks
  • I agree with raghavendrar0001. A 403 error occurs when you are trying to access content you don't have access to.

  • Can you describe the scenario, please? If the user uploads the file, I would recommend doing the insert asynchronously and informing the user when it is done, rather than having them wait for a few minutes. They can move on with their lives.
  • 0
    Certified Senior Developer
    in reply to Shanmukha
    Thanks for your reply. Unfortunately, there are no specific logs for the errors that I can see.

    Note that there is no exception while executing the process, nor does any node fail. When the error occurs, the model is still running in the background, uploading the data without any error. It then completes after some time with all the data inserted in the DB, and obviously with no exception in the logs nor any failure in the model.
  • 0
    Certified Senior Developer
    in reply to Mike Cichy
    Hi Mike,

    There are a few tables that need to be inserted into the DB, and most of them depend on each other via references. Hence we do not want a scenario where the user inserts the data for all tables (async) and most of the inserts fail due to constraint violations in the DB.

    Thanks
    ND
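The FK dependency concern above can be sketched with a minimal, hypothetical example (SQLite stands in for SQL Server purely for illustration; the table and column names are invented): with foreign keys enforced, child rows can only be inserted after their parent rows exist, so the load order of the tables matters.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite does not enforce FKs by default

conn.execute("CREATE TABLE table1 (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE table2 (id INTEGER PRIMARY KEY, "
    "table1_id INTEGER REFERENCES table1 (id))"
)

# Inserting a child row before its parent exists violates the FK constraint
try:
    conn.execute("INSERT INTO table2 (id, table1_id) VALUES (1, 100)")
except sqlite3.IntegrityError as exc:
    print("child-first insert fails:", exc)

# Parent first, then child: both inserts succeed
conn.execute("INSERT INTO table1 (id, name) VALUES (100, 'parent')")
conn.execute("INSERT INTO table2 (id, table1_id) VALUES (1, 100)")
conn.commit()
```

This is why firing all 10 file imports asynchronously at once risks constraint violations: any child-table import that finishes before its parent-table import will be rejected.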
  • Hi Dave,

    When we designed a similar application where thousands of rows needed to be validated and multiple tables needed to be updated sequentially, we picked an asynchronous approach:
    1. User uploads the file and the process starts.
    2. Validate the file format; on success, inform users that they will be notified via email about the status of the request.
    3. The Appian process orchestrates the rest of the process, i.e. importing the CSV data into the DB, validating the data, writing the results to another table, etc., just by executing stored procedures from the Appian process. Data is held at the DB level with a unique identifier for each transaction.
    4. Users receive emails with the results.
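Steps 3 and 4 above can be sketched minimally (the table and column names are hypothetical, and SQLite stands in for SQL Server purely for illustration): each import request is tagged with a unique identifier, and its outcome is written to a results table that the email-notification step can read from.

```python
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE import_data (txn_id TEXT, col1 TEXT, col2 TEXT)")
conn.execute(
    "CREATE TABLE import_results (txn_id TEXT, status TEXT, rows_loaded INTEGER)"
)

def run_import(rows):
    """Load one file's rows under a fresh transaction id and record the outcome."""
    txn_id = str(uuid.uuid4())  # the unique identifier Appian would pass along
    conn.executemany(
        "INSERT INTO import_data (txn_id, col1, col2) VALUES (?, ?, ?)",
        [(txn_id, a, b) for a, b in rows],
    )
    conn.execute(
        "INSERT INTO import_results (txn_id, status, rows_loaded) VALUES (?, ?, ?)",
        (txn_id, "COMPLETE", len(rows)),
    )
    conn.commit()
    return txn_id

txn = run_import([("a", "1"), ("b", "2")])
status, loaded = conn.execute(
    "SELECT status, rows_loaded FROM import_results WHERE txn_id = ?", (txn,)
).fetchone()
print(status, loaded)
```

In the real setup the inserts would be done by stored procedures invoked from the Appian process model, but the per-request identifier is what lets the final notification report status for exactly one upload.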
  • 0
    Certified Senior Developer
    in reply to mohamedt808
    Thanks for responding. I have 10 different CSV files, each to be inserted (not updated) into its respective table. So there are no transformations, manipulations, or validations.

    1. Did you use a different database, i.e. not MySQL? I need to have the data in SQL Server.
    2. We have constraints on the tables. Did you have any constraints? E.g. Table2's id is a FK to Table1's PK, which means no insertion can happen (in an async scenario) into Table2 until all the data in Table1 is inserted.
    3. If the answer to 1 is yes, how did you pass data from Appian to the stored proc?

    Thanks
    Nirav
  • Hi Dave,

    1. We used SQL Server.
    2. We didn't have such constraints.

    But the approach we followed might be helpful, hence sharing it:

    1. We used Transact-SQL in a stored proc to create a unique table at run time for importing the file data, where the table name and column names are passed from Appian to the DB via the stored proc.
    2. Once the table is created, pass the table name to this plugin and import the data from the CSV.
    3. For each transaction, the table name and a unique identifier (a GUID) are held in a separate table for reference and maintenance.
    4. So Appian knows the table name and uses it in a further stored proc to move the data from the temp table to the actual table, with a unique identifier for each file's data. Data for a specific file load is therefore uniquely identifiable via the unique ID that Appian passed.
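The staging pattern described above can be sketched as follows (a minimal illustration only: SQLite stands in for SQL Server, the table names are invented, and in the real setup steps 1 and 4 would be stored procedures using dynamic T-SQL while step 2 would be the Import CSV plugin):

```python
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")

# The actual destination table, plus a registry of staging tables (step 3)
conn.execute("CREATE TABLE actual_table (load_id TEXT, col1 TEXT, col2 TEXT)")
conn.execute("CREATE TABLE staging_registry (staging_table TEXT, load_id TEXT)")

def stage_and_move(csv_rows):
    """Create a per-load staging table, fill it, then move rows to the real table."""
    load_id = str(uuid.uuid4())
    staging = "stg_" + load_id.replace("-", "_")  # unique run-time table name
    # Step 1: create the staging table (dynamic SQL in a stored proc for real)
    conn.execute(f'CREATE TABLE "{staging}" (col1 TEXT, col2 TEXT)')
    conn.execute("INSERT INTO staging_registry VALUES (?, ?)", (staging, load_id))
    # Step 2: the plugin would import the CSV into this staging table
    conn.executemany(f'INSERT INTO "{staging}" VALUES (?, ?)', csv_rows)
    # Step 4: move the data to the actual table, tagged with this load's id
    conn.execute(
        f'INSERT INTO actual_table SELECT ?, col1, col2 FROM "{staging}"',
        (load_id,),
    )
    conn.commit()
    return load_id

lid = stage_and_move([("x", "1"), ("y", "2")])
count = conn.execute(
    "SELECT COUNT(*) FROM actual_table WHERE load_id = ?", (lid,)
).fetchone()[0]
print(count)
```

Because each load lands in its own staging table before touching the actual table, a failed or partial import never leaves half-loaded rows in the destination, and the registry makes cleanup of old staging tables straightforward.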
  • 0
    Certified Senior Developer
    in reply to daven
    Has anyone faced a similar issue, or found a solution to this?
  • 0
    Certified Lead Developer
    in reply to daven
    I have the same issue on 18.2 using the Excel Tools plugin. If the process completes in less than 5 minutes, I do not receive the error, but if the file is larger, as soon as the 5-minute mark hits I receive the 403 Access Denied error. I'm guessing the web server is timing out. So far I have not found a solution except to change completely to an asynchronous strategy. Can anyone confirm that the cloud web servers have a 5-minute timeout?