"Import CSV to database v3" plugin not working correctly

My requirement is to import a CSV document into a database. The CSV file's header field names contain underscores (e.g. "first_name", "last_name"), but the database table column names don't have underscores. That's why the data is not being written to the database successfully. Is there a way out here?


  • Certified Lead Developer

    If you are using the Import CSV to Database plugin component, your CSV field names and database table column names must be the same. If they are not, you can use the CSV to CDT Parser plugin to first parse your CSV into a CDT and then use that CDT to write the data to the database. If your CDT field names and table column names are different, you can map them in your CDT's XSD in the following way:

    <xsd:element name="first_name" nillable="true" type="xsd:string">
      <xsd:annotation>
        <xsd:appinfo source="appian.jpa">@Column(length=255, name="firstname")</xsd:appinfo>
      </xsd:annotation>
    </xsd:element>

    The name attribute inside the @Column annotation indicates the table column name, and it can be different from your <xsd:element name>, which is your CDT field name.

  • I would say it's simpler to make the change in the CSV file rather than in the DB. An alternative that has worked for me in the past is not including column headers in the CSV file, but the number of columns must match exactly.
  • Okay, my CDT field names and CSV header names are the same, but I have more fields in the CDT than in the CSV file, and my CSV values are semicolon-separated, not comma-separated. I have used the CSV to CDT Parser, but the CSV data is not coming into the PV.
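    If the parser assumes comma delimiters, one workaround is to re-write the file with commas as a pre-processing step before handing it to the plugin. A minimal sketch in Python (outside Appian; the function name is illustrative):

    ```python
    import csv
    import io

    def convert_delimiter(text: str, src: str = ";", dst: str = ",") -> str:
        """Re-write delimited text from one delimiter to another.

        Using the csv module (rather than a plain str.replace) keeps
        quoted fields that happen to contain the delimiter intact.
        """
        out = io.StringIO()
        reader = csv.reader(io.StringIO(text), delimiter=src)
        writer = csv.writer(out, delimiter=dst, lineterminator="\n")
        for row in reader:
            writer.writerow(row)
        return out.getvalue()

    sample = 'first_name;last_name\nJohn;Doe\n'
    print(convert_delimiter(sample))
    # first_name,last_name
    # John,Doe
    ```

    Extra CDT fields that have no matching CSV column will simply stay null after parsing; the delimiter is usually the actual blocker.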
  • Download the latest version of the Excel Tools plugin and use the Import CSV to Database smart service with the "no headers" option ticked. Use all the column headers as per the database and make sure the number of columns in the CSV and the database are the same. Sequence does not matter.

  • Hi Sudipta,
    Could you please attach the sample CSV and the DB table structure?
    Thanks.
  • Hi, technically you can't use this plugin if the names don't match. I remember building a couple of solutions for a different requirement where users could upload a file with columns in any order and names that won't match the DB column names.

    1. We let users map the columns (known column names) to the file headers (user-provided) in a UI, and used the Parse CSV to CDT plugin to dump the data into the table. (This might be a solution for your requirement if the file has a small number of rows, but note that you will have to do some re-mapping based on the columns the user selected. Also, these process instances will consume more memory if the file is large and uploaded frequently.)
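    The header-remapping step in approach 1 can be sketched as follows. The mapping dict stands in for whatever the user selects in the UI, and the names are hypothetical:

    ```python
    import csv
    import io

    def remap_headers(csv_text: str, user_mapping: dict) -> list:
        """Rename user-provided CSV headers to the known column names.

        user_mapping: {file_header: known_column_name}, as captured
        from the mapping UI. Returns a list of dicts keyed by the
        known column names, ready to be written to the table; file
        columns the user did not map are dropped.
        """
        reader = csv.DictReader(io.StringIO(csv_text))
        rows = []
        for row in reader:
            rows.append({user_mapping[h]: v for h, v in row.items() if h in user_mapping})
        return rows

    # Example: the user mapped "first_name" -> "firstname", etc.
    mapping = {"first_name": "firstname", "last_name": "lastname"}
    data = "first_name,last_name\nJohn,Doe\n"
    print(remap_headers(data, mapping))
    # [{'firstname': 'John', 'lastname': 'Doe'}]
    ```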

    2. Similar to the above approach (let users map the columns to the file headers in a UI), but do the rest of the work on the DB end with T-SQL. (Note: this approach was for a requirement where the file had 20k+ rows and up to 20 columns, with very frequent uploads, 10k+ on an average day, and Appian was just the orchestration tool here.)
    a. Parse just the file header and call a stored proc to create a table with a unique name at run time, using the parsed column names.
    b. Use the Import CSV to Database plugin to dump the file data into the table just created.
    c. Use another stored proc to move the data and delete the table after verification.
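    Step (a) above can be sketched as follows. This is a Python sketch that generates the T-SQL DDL; in the real solution the DDL would be built inside the stored proc, and the `stg_upload_` prefix and the NVARCHAR(255) column type are assumptions, not part of the original design:

    ```python
    import csv
    import io
    import re
    import uuid

    def staging_table_ddl(csv_header_line: str) -> str:
        """Build a CREATE TABLE statement for a uniquely named staging
        table from a CSV header line (step a of the approach above).

        Column names are sanitised to alphanumerics/underscores so a
        crafted header cannot inject SQL; every column is created as
        NVARCHAR(255) since the data is cast when it is moved to the
        real table (step c).
        """
        headers = next(csv.reader(io.StringIO(csv_header_line)))
        cols = []
        for h in headers:
            safe = re.sub(r"[^A-Za-z0-9_]", "_", h.strip())
            cols.append(f"[{safe}] NVARCHAR(255)")
        table = f"stg_upload_{uuid.uuid4().hex}"  # unique run-time name
        return f"CREATE TABLE [{table}] (\n  " + ",\n  ".join(cols) + "\n);"

    print(staging_table_ddl("first_name,last_name"))
    ```

    The Import CSV to Database plugin can then target the staging table directly (step b), since its column names were generated from the file's own headers and therefore match.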