I have a CSV file with delimiter '|'. When I try to run the smart service by passing a CSV file where a few fields are null, it returns the error message 'Error executing SQL'. But when I replace the nulls with random text values, it writes the data to the DB.
Is this an expected behavior of this smart service?
Is there any way to import or read a csv file with null values?
Are the null values separated by your defined delimiter?
Hi,
Yes, it works with nulls, but only if the database columns allow null values. If the columns are NOT NULL, it fails with "Error executing SQL." Either update the DB schema or preprocess the CSV to fix it.
The best approach is definitely to verify the database schema, as Srinivas mentioned. If the columns are defined as NOT NULL, then inserting null or empty values will cause SQL errors — which seems to be exactly what's happening in your case.
If modifying the database to allow nulls is not an option, there is a workaround:
You can use the readcsvlog() function to read the CSV file into a CDT. Once the data is loaded, you can loop through the CDT in Appian, check for null or empty values, and replace them with default values (e.g., "N/A" for text, 0 for numbers). After that, you can insert the cleaned data into the database using Write to Data Store Entity.
However, this workaround is significantly more complex and less efficient than simply updating the database schema to allow nulls in the appropriate columns. If possible, adjusting the DB structure is the cleaner and more maintainable solution.
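If you end up preprocessing the file outside Appian instead, the cleanup Srinivas describes could be sketched like this in Python (a minimal example, assuming a pipe delimiter; the per-column defaults here are hypothetical and would need to match your table):

```python
import csv

# Hypothetical per-column defaults: "N/A" for text columns, "0" for numeric ones.
DEFAULTS = {0: "N/A", 1: "N/A", 2: "0"}

def clean_csv(in_path, out_path, delimiter="|"):
    """Replace empty fields with per-column defaults before importing."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.reader(src, delimiter=delimiter)
        writer = csv.writer(dst, delimiter=delimiter)
        for row in reader:
            writer.writerow(
                field if field.strip() else DEFAULTS.get(i, "N/A")
                for i, field in enumerate(row)
            )
```

You would run this on the file before uploading it, so the smart service never sees an empty field.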
Yes, the null values are separated by the delimiter.
But I noticed today that the error is thrown only when the first column in the CSV file is empty; if other columns in between are null, it is able to read the file and write NULL to the DB.
The readcsvlog() function only allows reading from log files. We won't be able to read from a CSV document with it.
You're absolutely right — I responded too quickly without thinking it through.
The readcsvlog() function is specifically designed to read log files, not general CSV documents. It only works with the application server log files, not with user-uploaded CSV files stored as Appian documents.
Thanks for pointing that out! I appreciate the clarification.
I just tested this on my side and it works correctly even when the first column is empty — the data is read and inserted as null into the database without any errors.
Are you sure the column mapping is correct in your configuration? It might be worth double-checking that the columns in your CSV match the expected structure and that the target table allows nulls where needed.
Would you mind sharing a sample of the CSV file that’s causing the issue, along with the table script you're using?
That would help a lot to reproduce the problem and see what might be going wrong with the first column specifically.
Thanks!
I realized I'm using an old version of the plugin; I'll update to the latest version and give it a try.
As an alternative, you could try using the previewCSVForImport() function from the Excel Tools plugin. It allows you to read CSV documents uploaded to Appian. Example:
previewCSVForImport(
  document: todocument(ri!csvFile),
  delimiter: ";",
  charset: "UTF-8",
  numberOfRows: 1000
)