Total Time taken for CSV to Database Import

Hello

We are using Appian 17.4 to bulk upload CSV data to a SQL Server database. We are doing this with the ‘Import CSV to Database’ smart service.

One of the CSV files has 6 columns and around 20k records. There are no data transformations.

Below are two issues:

Access Denied Error:
When we upload the CSV containing around 20k rows, the data is inserted into the database successfully. However, after the successful upload, a ‘403 Access Denied’ error is shown on the screen.

Total Upload time:
Uploading a CSV containing 5k rows takes around 5 minutes, which seems a bit high. This is understandable, since Appian is communicating with a different server to write the data.

Is there any way, or an alternative approach, to reduce the time taken to upload the data to the database?

 

Thanks

Nirav

  • Thanks ryanh for the detailed information.
    Yes, that seems to be the best approach. In that case, how can we pass the CSV to the stored procedure? Should we store it in the KC? If so, how can the database get access to the file in the KC?

    Thanks again
  • I played around with this right after I posted that long email (since I too haven't yet solved the issue).

    In my case, my source data comes from a DB2 table; I parse the pipe-delimited data and then use it to run some select queries, if/then logic, and inserts/updates into MS SQL. Since I don't really start with a CSV file and I didn't want to worry about security (loading a CSV into the database system or using the native 'bulk insert' capability), I chose to query the DB2 data using a query entity, force the data into a big delimited PV variable, and change some of the delimiters that Appian adds (brackets, semicolons followed by a space):

    /* flatten the query-entity result into one big delimited string */
    tostring(pv!db2datasubset.data.ReportData_Txt)
    /* replace Appian's "; " separators with a custom @@@ delimiter */
    substitute(pv!db2delimited, "; ", "@@@")

    That leaves me with a @@@-delimited array of sorts, with nested pipe-delimited strings, saved in my PV.

    I haven't yet passed that from Appian to a stored proc, but I played with making a function that would let me quickly parse a pipe-delimited string. My hope was that I could create a stored procedure that loops over a single varchar(max) parameter, splits the single large string into essentially an array of smaller rows of data, and then, in a loop, performs the select/if/then/insert logic we need on each parsed row.
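    Purely as an illustration of that idea, here is a rough sketch of such a stored procedure. The procedure name, parameter names, and the per-row placeholder are made up; it just walks a @@@-delimited varchar(max) string row by row with CHARINDEX, which works fine on SQL Server 2012:

    CREATE PROCEDURE usp_ImportDelimitedRows      -- hypothetical name
        @payload      VARCHAR(MAX),               -- the whole @@@-delimited string passed in from Appian
        @rowDelimiter VARCHAR(5) = '@@@'
    AS
    BEGIN
        DECLARE @row VARCHAR(MAX), @pos INT;
        WHILE LEN(@payload) > 0
        BEGIN
            -- find the next row delimiter; 0 means this is the last row
            SET @pos = CHARINDEX(@rowDelimiter, @payload);
            IF @pos = 0
                BEGIN SET @row = @payload; SET @payload = ''; END
            ELSE
                BEGIN
                    SET @row = LEFT(@payload, @pos - 1);
                    SET @payload = SUBSTRING(@payload, @pos + LEN(@rowDelimiter), LEN(@payload));
                END;
            -- per-row work goes here: parse the pipe-delimited columns out of @row,
            -- then run the select / if-then / insert-update logic against the target table
        END;
    END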

    There is a new STRING_SPLIT function built into MS SQL 2016. Unfortunately, we are still on 2012 (and running under the 2010 compatibility level), and there isn't a native splitting function that lets you grab element X from a delimited list (let alone specify the delimiter). I got slowed down looking online on Stack Exchange for fast, low-overhead examples of splitting functions. There are some really interesting solutions, though some require CLR and installed executables rather than native SQL logic (or are slow).
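    For reference, the 2016+ built-in looks like this; it returns one row per element (in a column named value), only takes a single-character delimiter, and doesn't guarantee ordering, so it still wouldn't let you ask for "element X" directly:

    -- SQL Server 2016+ only
    SELECT value
    FROM STRING_SPLIT('123|Acme Corp|2024-01-05', '|')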

    Back to your question:

    Pull the CSV file into a PV variable of type Text.
    I haven't yet called the SQL function (or made a stored procedure) directly from Appian, but I assume I can pass the string in as a varchar(max) parameter; varchar(max) isn't limited to the traditional 8,000-byte length that varchar(n) is.

    CREATE FUNCTION fn_Wordparser
    (
    @multiwordstring VARCHAR(max),
    @wordnumber int,
    @delimiter varchar(5)
    )
    returns VARCHAR(500)
    AS
    BEGIN
        -- one possible body: skip past (@wordnumber - 1) delimiters, then return the next element
        WHILE @wordnumber > 1 AND CHARINDEX(@delimiter, @multiwordstring) > 0
            SELECT @multiwordstring = STUFF(@multiwordstring, 1, CHARINDEX(@delimiter, @multiwordstring) + LEN(@delimiter) - 1, ''),
                   @wordnumber = @wordnumber - 1
        RETURN LEFT(@multiwordstring, ISNULL(NULLIF(CHARINDEX(@delimiter, @multiwordstring) - 1, -1), LEN(@multiwordstring)))
    END
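
    Once the function exists, grabbing, say, the second pipe-delimited element from a (made-up) sample row would look something like this:

    SELECT dbo.fn_Wordparser('123|Acme Corp|2024-01-05', 2, '|')   -- returns 'Acme Corp'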