Overview
Provides the following custom smart services and functions for handling data in Excel.
Key Features & Functionality
Smart Services included are:
Functions included are:
The Read Excel File Paged function (readexcelsheetpaging()) is not reading an Excel file properly in one of our use cases. Could you please review and fix this issue?
Use case and steps to reproduce:
Create a simple Excel file with two columns (two headers).
Add a simple Excel formula to the second column.
The logic is: based on the column 1 value, conditionally populate the column 2 value.
Ex: =IF(A2:A100 = "", "-", IF(A2:A100<100, "TRUE","FALSE"))
Populate the data in column 1 of this template using an Appian process (you can use Export CDT to Excel or Export SQL to Excel).
Note: I know the above-mentioned smart services are deprecated, but they are working fine (you may already know about this, as you're the owner of the plugin and as community team members have mentioned).
Now use readexcelsheetpaging() to read the generated Excel file.
It only returns the first column's values; the second column's values come back blank.
Note 1: If you now download the same generated document from Appian to a local desktop and open the file, you will see the column 1 and column 2 values properly.
Save the file and re-upload it to Appian manually; if we then read the file using readexcelsheetpaging(), it reads properly.
Note 2: If we use the readexcelsheet() function instead, it reads the file properly (but it is not recommended, as it consumes too many system resources).
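For what it's worth, the symptom matches formula cells written without cached results: Excel only stores a formula's computed value after the workbook has been opened and recalculated, which would explain why downloading, opening, saving, and re-uploading the file makes readexcelsheetpaging() work. Below is a minimal Apache POI sketch of the difference between reading a formula cell's cached value and forcing evaluation at read time; it assumes (not confirmed here) that the plugin reads workbooks with POI, and the file name and sheet index are placeholders:

    import java.io.File;
    import org.apache.poi.ss.usermodel.Cell;
    import org.apache.poi.ss.usermodel.DataFormatter;
    import org.apache.poi.ss.usermodel.FormulaEvaluator;
    import org.apache.poi.ss.usermodel.Row;
    import org.apache.poi.ss.usermodel.Sheet;
    import org.apache.poi.ss.usermodel.Workbook;
    import org.apache.poi.ss.usermodel.WorkbookFactory;

    public class ReadWithFormulaEvaluation {
        public static void main(String[] args) throws Exception {
            // "generated.xlsx" is a placeholder for the file produced by the export process.
            try (Workbook wb = WorkbookFactory.create(new File("generated.xlsx"))) {
                Sheet sheet = wb.getSheetAt(0);
                FormulaEvaluator evaluator = wb.getCreationHelper().createFormulaEvaluator();
                DataFormatter fmt = new DataFormatter();
                for (Row row : sheet) {
                    Cell col1 = row.getCell(0);
                    Cell col2 = row.getCell(1);
                    // A formula cell written by a tool that never calculated it has no
                    // cached result, so reading it without evaluation comes back blank.
                    String cached = fmt.formatCellValue(col2);
                    // Passing an evaluator computes the formula at read time instead.
                    String evaluated = fmt.formatCellValue(col2, evaluator);
                    System.out.printf("%s | cached=[%s] evaluated=[%s]%n",
                            fmt.formatCellValue(col1), cached, evaluated);
                }
            }
        }
    }

If that is indeed the cause, a difference in whether the two functions evaluate formulas while reading would account for readexcelsheet() returning the values while readexcelsheetpaging() returns blanks.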
Hi, I have a use case where it would be perfect if the readexcelsheet() function had a parameter for stopping at a certain row limit. How did you work around this issue?
@manishk114 Is this issue resolved? I had 57 columns previously, which was working fine. Now I added one column and am getting the same error:
You need to set exactly 58 parameters on the prepared statement
Hi Team,
We are facing a strange issue with the CSV to DB smart service.
Summary: Issue with the Import CSV to Database smart service - the CSV does not get uploaded with more than 48 columns. Previously the CSV had 48 columns and it was working fine. Later we had a new requirement to add a column, and after adding the 49th column it stopped working, showing the attached error. After some digging we found that if we remove any one column (either the new one or any other) and remove the same column from the column mapping in the Columns parameter of the CSV to DB smart service, it works fine. As long as we keep the column count within 48, all works well.
While checking the documentation, I see that the generated logs of this plugin can be viewed. Can you please help with how to get the logs of its prepared statement?
I can see the below error in the Tomcat logs:
ERROR ImportCsvToDatabase8 Error executing SQL
java.sql.SQLSyntaxErrorException: (conn=3176834) You need to set exactly 48 parameters on the prepared statement
    at org.mariadb.jdbc.internal.util.exceptions.ExceptionFactory.createException(ExceptionFactory.java:62)
    at org.mariadb.jdbc.internal.util.exceptions.ExceptionFactory.create(ExceptionFactory.java:176)
    at org.mariadb.jdbc.ClientSidePreparedStatement.addBatch(ClientSidePreparedStatement.java:256)
    at sun.reflect.GeneratedMethodAccessor9526.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.tomcat.jdbc.pool.StatementFacade$StatementProxy.invoke(StatementFacade.java:118)
    at com.sun.proxy.$Proxy218.addBatch(Unknown Source)
    at com.appiancorp.ps.exceltools.util.DatabaseUtils.addBatchInsert(DatabaseUtils.java:594)
    at com.appiancorp.ps.exceltools.util.CsvHelperUtils.importDataWithHeaderFromFile(CsvHelperUtils.java:181)
    at com.appiancorp.ps.exceltools.smartservice.ImportCsvToDatabase8.processTheRequest(ImportCsvToDatabase8.java:90)
    at com.appiancorp.ps.exceltools.smartservice.ImportCsvToDatabase8.run(ImportCsvToDatabase8.java:70)
    at com.appiancorp.process.runtime.framework.DefaultActivityExecutor$1.lambda$call$0(DefaultActivityExecutor.java:136)
    at com.appiancorp.plugins.PluginUsageLogger.runWithPluginInformation(PluginUsageLogger.java:54)
    at com.appiancorp.process.runtime.framework.DefaultActivityExecutor$1.call(DefaultActivityExecutor.java:136)
    at com.appiancorp.process.runtime.framework.DefaultActivityExecutor$1.call(DefaultActivityExecutor.java:130)
    at com.appiancorp.common.ContextClassLoaderSwitcher.runInContext(ContextClassLoaderSwitcher.java:27)
    at com.appiancorp.process.runtime.framework.DefaultActivityExecutor.execute(DefaultActivityExecutor.java:130)
    at com.appiancorp.process.engine.UnattendedJavaActivityRequest.execute(UnattendedJavaActivityRequest.java:88)
    at com.appiancorp.process.engine.UnattendedJavaActivityRequestResponseCreator.getJavaActivityResultResponse(UnattendedJavaActivityRequestResponseCreator.java:22)
    at com.appiancorp.process.engine.UnattendedJavaActivityRequest.getJavaActivityResultResponse(UnattendedJavaActivityRequest.java:67)
    at com.appiancorp.process.engine.UnattendedJavaActivityRequest.execute0(UnattendedJavaActivityRequest.java:55)
    at com.appiancorp.common.persistence.data.AppianDataUserContextProvider.executeWithConsistentDataReads(AppianDataUserContextProvider.java:143)
    at com.appiancorp.process.engine.ProcessContinuationRequest.executeOuter(ProcessContinuationRequest.java:68)
    at com.appiancorp.process.engine.ContinuationRequest.execute(ContinuationRequest.java:129)
    at com.appiancorp.process.workpoller.UnattendedRequestHandlerBean.onMessage(UnattendedRequestHandlerBean.java:36)
    at com.appiancorp.process.workpoller.WorkItem.run(WorkItem.java:110)
    at org.springframework.jca.work.SimpleTaskWorkManager$DelegatingWorkAdapter.run(SimpleTaskWorkManager.java:241)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
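For context on the error text itself: "You need to set exactly 48 parameters on the prepared statement" comes from the MariaDB driver when addBatch() is called with a different number of bound values than there are "?" placeholders in the generated INSERT. A rough JDBC sketch of that pattern, with a purely illustrative table, column list, and connection URL (not the plugin's actual code):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.util.Collections;

    public class BatchInsertSketch {
        public static void main(String[] args) throws Exception {
            // Illustrative values only; the real table, columns, and URL come from the
            // smart service inputs and the data source configuration.
            String[] columns = {"col1", "col2", "col3"};
            String placeholders = String.join(", ", Collections.nCopies(columns.length, "?"));
            String sql = "INSERT INTO my_table (" + String.join(", ", columns)
                    + ") VALUES (" + placeholders + ")";
            String[][] csvRows = {{"a", "b", "c"}, {"d", "e", "f"}};

            try (Connection conn = DriverManager.getConnection("jdbc:mariadb://localhost/mydb", "user", "pass");
                 PreparedStatement ps = conn.prepareStatement(sql)) {
                for (String[] row : csvRows) {
                    // Every placeholder must be bound for every row. If a CSV row has
                    // more (or fewer) values than the statement has placeholders, the
                    // driver rejects addBatch() with "You need to set exactly N
                    // parameters on the prepared statement".
                    for (int i = 0; i < columns.length; i++) {
                        ps.setString(i + 1, row[i]);
                    }
                    ps.addBatch();
                }
                ps.executeBatch();
            }
        }
    }

Read that way, 48 may not be a hard limit in the plugin; the error suggests the generated statement's placeholder count and the values bound per CSV row disagree after the 49th column was added, so it may be worth comparing the CSV header, the column mapping parameter, and the target table definition column by column.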
Hi ,
We are seeing an issue with the Import CSV to Database smart service when the number of columns is more than 48.
Is there any limit on the number of columns that can be imported successfully?
Thanks
We are using Excel Tools v2.2.11. We want to handle a use case where a user uploads an unreadable or corrupted Excel file from the UI using a!fileUploadField(). We are using the readexcelsheet() function to read the file, but it gives an Expression Evaluation Error.
Please suggest how to catch this expression evaluation error thrown by the readexcelsheet() function when trying to read an unreadable or corrupted Excel sheet. We want to delete the document (if it is unreadable) and show the user an error message that the uploaded document is unreadable.
Please find below a screenshot showing the error thrown by readexcelsheet():
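Appian expressions do not offer a general try/catch, so one common approach is to validate the uploaded document before the interface or expression ever calls readexcelsheet(), for example in a process step (or a small custom function) that has its own error handling. A hedged Java/Apache POI sketch of that kind of check; the class, helper name, and file path are made up for illustration, and it assumes the function fails because the workbook cannot be opened:

    import java.io.File;
    import org.apache.poi.ss.usermodel.Workbook;
    import org.apache.poi.ss.usermodel.WorkbookFactory;

    public class WorkbookSanityCheck {
        // Hypothetical helper: returns true only if POI can open the file as a workbook.
        static boolean isReadableWorkbook(File file) {
            try (Workbook wb = WorkbookFactory.create(file)) {
                return wb.getNumberOfSheets() > 0;
            } catch (Exception e) {
                // Corrupted, empty, or non-Excel uploads typically fail here, which is
                // roughly what readexcelsheet() surfaces as an expression evaluation error.
                return false;
            }
        }

        public static void main(String[] args) {
            // "upload.xlsx" stands in for the document uploaded via a!fileUploadField().
            System.out.println(isReadableWorkbook(new File("upload.xlsx")));
        }
    }

If the check fails, the process can delete the document and surface a validation message to the user, rather than letting the read fail inside the interface expression.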
Any plans to support synced record types for the Import smart services?
Please see community thread here: community.appian.com/.../data-gets-updated-in-db-but-in-record-it-is-not-getting-synced
The Appian Health Check is flagging this plugin as high risk: "Excel Tools (com.appiancorp.ps.exceltools) references private and deprecated Appian APIs". Will an upcoming version of the plugin address these findings? Thanks!