Export SQL to Flat File (CSV format) - can we have column values in the CSV file with double quotes?

Hello, 

We need to export all tables and views from the database to CSV files through an Appian process. The environment is set up in Appian Cloud. The column data should be enclosed in double quotes and comma separated, as in the example below:

"column header1","column header2"

"col1-value1","col2-sample-value2"

"col1-value2","col2-sample-value2"

We are using the Export SQL to Flat File smart service to do this. The headers come out with double quotes; however, the values are not enclosed in double quotes. The whole process is automated, as we do not have a prebuilt SQL script for this. We use the information_schema table of MySQL to find the tables and then use "SELECT *" to build the query. Can we get help on how to enclose the column values in double quotes?
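For illustration, the automated query-building step described above might be sketched roughly like this (the class and method names are invented for this sketch, not part of the actual process model):

```java
import java.util.List;
import java.util.stream.Collectors;

public class ExportQueryBuilder {
    // Hypothetical helper: given the table/view names discovered from
    // information_schema, build one "SELECT *" export query per table.
    // The discovery query itself (run against MySQL) would be something like:
    //   SELECT table_name FROM information_schema.tables
    //   WHERE table_schema = 'mydb';
    static List<String> buildExportQueries(List<String> tableNames) {
        return tableNames.stream()
                .map(name -> "SELECT * FROM `" + name + "`")
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        buildExportQueries(List.of("customers", "orders"))
                .forEach(System.out::println);
        // prints:
        // SELECT * FROM `customers`
        // SELECT * FROM `orders`
    }
}
```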


Thank you,
Harish


  • Harish, this is not currently possible in the node. Looking at the source code, the headers are written with quotes and the values are not. I would suggest a different delimiter if the values can contain commas.
    Another option is to edit the plugin and add a boolean parameter that signifies whether the values are to be enclosed or not. The source code is included in the plugin file.
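    A minimal sketch of what such an always-quote option could look like (the names and structure below are assumptions for illustration, not the plugin's actual internals):

```java
import java.util.StringJoiner;

public class QuoteAllCsv {
    // Hypothetical always-quote CSV writer: when quoteAll is true, every
    // value (numbers, nulls, strings) is wrapped in double quotes, with any
    // embedded quote characters doubled per the usual CSV convention.
    static String toCsvRow(Object[] values, boolean quoteAll) {
        StringJoiner row = new StringJoiner(",");
        for (Object v : values) {
            String s = (v == null) ? "" : v.toString();
            if (quoteAll) {
                row.add("\"" + s.replace("\"", "\"\"") + "\"");
            } else {
                row.add(s); // current behavior: values written as-is
            }
        }
        return row.toString();
    }

    public static void main(String[] args) {
        System.out.println(toCsvRow(new Object[]{"Harish", 42, null}, true));
        // prints: "Harish","42",""
    }
}
```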
  • Hello Harish,

    I have one question: who is reading the target file? Excel? In that case this implementation will be enough, but if something else is reading it, please keep reading.

    I was reviewing the code and found that, when the column is a string/varchar, the value is returned as "return StringEscapeUtils.escapeCsv(string)". In the documentation you will find:

    Returns a String value for a CSV column enclosed in double quotes, if required.
    If the value contains a comma, newline or double quote, then the String value is returned enclosed in double quotes.
    Any double quote characters in the value are escaped with another double quote.
    If the value does not contain a comma, newline or double quote, then the String value is returned unchanged.

    Reference: https://commons.apache.org/proper/commons-lang/javadocs/api-3.1/org/apache/commons/lang3/StringEscapeUtils.html#escapeCsv(java.lang.String) 
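    To make the documented behavior concrete, here is a small re-implementation of those rules (a sketch for illustration only; real code should call the Commons Lang method). The last case shows why a value that already contains double quotes comes back wrapped in triple quotes:

```java
public class EscapeCsvDemo {
    // Re-implementation of the documented escapeCsv rules, for illustration.
    static String escapeCsv(String s) {
        if (s.contains(",") || s.contains("\"")
                || s.contains("\n") || s.contains("\r")) {
            // Wrap in quotes and double any embedded quote characters.
            return "\"" + s.replace("\"", "\"\"") + "\"";
        }
        return s; // no comma, newline, or quote: returned unchanged
    }

    public static void main(String[] args) {
        System.out.println(escapeCsv("plain"));      // prints: plain
        System.out.println(escapeCsv("a,b"));        // prints: "a,b"
        System.out.println(escapeCsv("\"Harish\"")); // prints: """Harish"""
    }
}
```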

    For example, if you do something like this in the query:

    ="select concat('""', firstname, '""') as firstname, lastname from test.people;"

    You will get in your file

    """Harish""", """LastName"""

     

    If you really need the values to be quoted, you will need your target system (the one that reads the CSV) to do something special with this.

     

    Hope this helps

    Jose Perez

  • Appian Employee (in reply to josep):
    Jose, you are correct, I missed the escape statement.
  • No problem, I was answering and didn't realize you had already answered. Anyway, the big question is whether he really needs the quotes everywhere, for example on numbers; if he does, maybe the best option is to write his own version of the plugin.
  • Thanks, Jose and Mike, for taking the time to reply.

    The target system is a data warehouse tool whose details we are not aware of. Their requirement is to have double quotes irrespective of the type of value, i.e. double quotes must be present for numbers, nulls, strings, etc.

    As you correctly mentioned, we tried the query with the concat of double quotes and it returned the result you described: values starting and ending with three double quotes.

    We are thinking of implementing this in one of these ways:
    1. build a new plugin
    2. create a view that adds the quotes for us

    Thank you,
    Harish
  • Hi Jose,

    We were testing the plugin and found that the double quotes are not added even when a comma is present in the value. Can you think of any reason for this?

    Thank you,
    Harish
  • Sorry for my late response; maybe you have already solved the problem.
    What I saw when testing this plugin was the behavior I described previously. Since that behavior is in the code, here is the only explanation I can think of:
    The way the string is mapped to the CSV is straightforward unless you are using a really large string in the database. But I think that is not your case, so can you explain your scenario a little more? Or, if it was solved, please post the answer in case someone else needs it.
    One possible option is that we are using different versions of the plugin.

    Jose