Excel Tools

Overview

Provides the following custom smart services and functions for handling data in Excel.

Key Features & Functionality

Smart Services included are:

  • Merge Excel Documents
  • Import CSV to Database
  • Import Excel to Database
  • Load CSV to Database
  • Convert HTML to CSV  
  • Convert Excel to CSV
  • Encrypt Excel with Password

Functions included are:

  • Read Excel File Paged
  • Read Excel Cell by Name
  • Read Excel Cell by Number
  • Query Appian Logs
  • Export Datasubset to Base64
  • Validate Document Headers
  • @7 where did you get the latest version 2.2.12? Or could you tell us which version you are referring to: the plugin version? If so, the latest App Market version is 2.2.11.

  • Hi @dmitry.bocharnikov - We have updated the plugin codebase so that "Import CSV to Database" also works with PostgreSQL, and it is now working successfully. We have the codebase version (2.2.12) with us. To avoid conflicts with your upcoming versions, we are happy to share the class file ("DatabaseUtils.java") that we changed. You can merge this change into your version and use it as a reference when working on the next release.

    Find below the updated DatabaseUtils java file:

    package com.appiancorp.ps.exceltools.util;
    
    import java.io.BufferedWriter;
    import java.io.File;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.sql.*;
    import java.text.NumberFormat;
    import java.text.ParseException;
    import java.text.SimpleDateFormat;
    import java.time.LocalDateTime;
    import java.time.format.DateTimeFormatter;
    import java.time.format.DateTimeParseException;
    import java.util.Date;
    
    import javax.naming.Context;
    import javax.naming.NamingException;
    import javax.sql.DataSource;
    
    import org.apache.commons.lang.StringEscapeUtils;
    import org.apache.commons.lang.StringUtils;
    import org.apache.log4j.Logger;
    
    import com.appiancorp.suiteapi.common.Constants;
    import com.appiancorp.suiteapi.common.ResultPage;
    import com.appiancorp.suiteapi.content.Approval;
    import com.appiancorp.suiteapi.content.ContentConstants;
    import com.appiancorp.suiteapi.content.ContentFilter;
    import com.appiancorp.suiteapi.content.ContentService;
    import com.appiancorp.suiteapi.encryption.EncryptionService;
    import com.appiancorp.suiteapi.knowledge.Document;
    import com.appiancorp.suiteapi.type.AppianType;
    import com.appiancorp.suiteapi.type.TypedValue;
    
    public class DatabaseUtils {
    
      private static final Logger LOG = Logger.getLogger(DatabaseUtils.class);
      private static final int TYPE_ORACLE_CURSOR = -10;
    
      public enum DatabaseType {
        POSTGRE("postgre", "\"", "\""),
        ORACLE("oracle", "\"", "\""),
        MSSQL("microsoft", "[", "]"),
        MYSQL("mysql", "`", "`"),
        DB2("db2", "", ""),
        MARIADB("mariadb", "`", "`");
    
        private final String name;
        private final String startQuoteSymbol;
        private final String endQuoteSymbol;
    
        private DatabaseType(String name, String startQuoteSymbol, String endQuoteSymbol) {
          this.name = name;
          this.startQuoteSymbol = startQuoteSymbol;
          this.endQuoteSymbol = endQuoteSymbol;
        }
    
        public Boolean matches(String databaseName) {
          return databaseName.toLowerCase().contains(this.name);
        }
      }
    
      public static Connection getConnection(String dataSourceName, Context ctx_) throws NamingException, SQLException {
        if (LOG.isDebugEnabled()) {
          LOG.debug("Getting Connection for Datasource name:" + dataSourceName);
        }
        DataSource ds = (DataSource) ctx_.lookup(dataSourceName);
        return ds.getConnection();
      }
    
      public static String getSelectSQL(DatabaseType dbType, String tableName) {
        StringBuffer sql = new StringBuffer("SELECT * FROM ");
    
        appendTableNameWithQuotes(dbType, sql, tableName);
    
        sql.append(" WHERE 1=0 ");
        if (LOG.isDebugEnabled()) {
          LOG.debug("Generated Select SQL: " + sql);
        }
        return sql.toString();
      }
    
      public static String getSelectSQL(DatabaseType dbType, String tableName, String[] columns) {
        StringBuffer sql = new StringBuffer("select ");
        sql.append(dbType.startQuoteSymbol);
        sql.append(StringUtils.join(columns, dbType.endQuoteSymbol + ", " + dbType.startQuoteSymbol));
        sql.append(dbType.endQuoteSymbol);
        sql.append(" from ");
    
        appendTableNameWithQuotes(dbType, sql, tableName);
    
        sql.append(" where 1=0 ");
        if (LOG.isDebugEnabled()) {
          LOG.debug("Generated Select SQL: " + sql);
        }
        return sql.toString();
      }
    
      public static String[] getColumnNames(Connection conn, DatabaseType dbType, String tableName)
        throws SQLException {
        PreparedStatement ps = null;
        ResultSet rs = null;
        DatabaseMetaData dbMetadata = conn.getMetaData();
        try {
          if (DatabaseType.MYSQL.matches(dbMetadata.getDatabaseProductName())) {
            ps = conn.prepareStatement(getSelectSQL(dbType, tableName),
              java.sql.ResultSet.TYPE_FORWARD_ONLY, java.sql.ResultSet.CONCUR_READ_ONLY);
            ps.setFetchSize(Integer.MIN_VALUE);
          } else {
            ps = conn.prepareStatement(getSelectSQL(dbType, tableName));
          }
          rs = ps.executeQuery();
          ResultSetMetaData rsmd = rs.getMetaData();
          int columnCount = rsmd.getColumnCount();
    
          String[] columnNames = new String[columnCount];
    
          for (int i = 0; i < columnCount; i++) {
            String colname = rsmd.getColumnName(i + 1).toLowerCase();
            columnNames[i] = colname;
          }
          return columnNames;
        } finally {
          try {
            if (rs != null) {
              rs.close();
            }
          } catch (Exception e) {
          }
          try {
            if (ps != null) {
              ps.close();
            }
          } catch (Exception e) {
          }
        }
      }
    
      /* The csvColumnToDBColumn contains indexes of the column names stored in the "columns" array that matched to the database */
      public static int[] getColumnTypes(Connection conn, DatabaseType dbType, String tableName, String[] columns)
        throws SQLException {
        PreparedStatement ps = null;
        ResultSet rs = null;
        DatabaseMetaData dbMetadata = conn.getMetaData();
        if (columns == null || columns.length == 0) {
          throw new IllegalArgumentException("None of the columns from the CSV file matched the database table \"" + tableName + "\"");
        }
    
        try {
          if (DatabaseType.MYSQL.matches(dbMetadata.getDatabaseProductName())) {
    
            ps = conn.prepareStatement(getSelectSQL(dbType, tableName, columns),
              java.sql.ResultSet.TYPE_FORWARD_ONLY, java.sql.ResultSet.CONCUR_READ_ONLY);
            ps.setFetchSize(Integer.MIN_VALUE);
          } else {
            ps = conn.prepareStatement(getSelectSQL(dbType, tableName, columns));
          }
          rs = ps.executeQuery();
          ResultSetMetaData rsmd = rs.getMetaData();
    
          int[] columnTypes = new int[columns.length];
          for (int i = 0; i < columns.length; i++) {
            int coltype = rsmd.getColumnType(i + 1);
            columnTypes[i] = coltype;
          }
          return columnTypes;
        } finally {
          try {
            if (rs != null) {
              rs.close();
            }
          } catch (Exception e) {
          }
          try {
            if (ps != null) {
              ps.close();
            }
          } catch (Exception e) {
          }
        }
      }
    
      private static String getFormattedValue(ResultSet rs, int column, SimpleDateFormat dateFormat,
        SimpleDateFormat dateTimeFormat, SimpleDateFormat timeFormat) throws SQLException {
    
        int coltype = rs.getMetaData().getColumnType(column);
        Date d = null;
        switch (coltype) {
        case Types.BIT:
        case Types.BOOLEAN:
          boolean b = rs.getBoolean(column);
          if (rs.wasNull()) {
            return "";
          } else {
            return b ? "1" : "0";
          }
        case Types.DATE:
          d = rs.getDate(column);
          if (d == null || rs.wasNull()) {
            return "";
          } else {
            return dateFormat.format(d);
          }
        case Types.TIMESTAMP:
          d = rs.getTimestamp(column);
          if (d == null || rs.wasNull()) {
            return "";
          } else {
            return dateTimeFormat.format(d);
          }
        case Types.TIME:
          d = rs.getTime(column);
          if (d == null || rs.wasNull()) {
            return "";
          } else {
            return timeFormat.format(d);
          }
        case Types.DOUBLE:
        case Types.FLOAT:
        case Types.DECIMAL:
        case Types.NUMERIC:
        case Types.REAL:
          double doub = rs.getDouble(column);
          if (rs.wasNull()) {
            return "";
          } else {
            NumberFormat f = NumberFormat.getInstance();
            f.setGroupingUsed(false);
            f.setMinimumFractionDigits(0);
            f.setMaximumFractionDigits(500);
    
            return f.format(doub);
          }
        case Types.INTEGER:
        case Types.BIGINT:
        case Types.SMALLINT:
        case Types.TINYINT:
          Long l = rs.getLong(column);
          if (rs.wasNull()) {
            return "";
          } else {
            return l.toString();
          }
    
        case Types.VARCHAR:
        case Types.CHAR:
        case Types.LONGVARCHAR:
          String s = rs.getString(column);
          if (rs.wasNull()) {
            return "";
          } else if (StringUtils.isEmpty(s)) {
            return "";
          } else {
            return StringEscapeUtils.escapeCsv(s);
          }
    
        case Types.NVARCHAR:
        case Types.NCHAR:
        case Types.LONGNVARCHAR:
          String n = rs.getNString(column);
          if (rs.wasNull()) {
            return "";
          } else if (StringUtils.isEmpty(n)) {
            return "";
          } else {
            return StringEscapeUtils.escapeCsv(n);
          }
    
        default:
          LOG.warn("Error no handler for " + coltype);
          return "";
        }
      }
    
      public static StringBuffer exportData(Connection conn, String sql, String delimiter, boolean includeHeader,
        String end_of_line, SimpleDateFormat dateFormat, SimpleDateFormat dateTimeFormat,
        SimpleDateFormat timeFormat) throws SQLException, ParseException, IOException {
    
        StringBuffer s = new StringBuffer();
        PreparedStatement ps = null;
        ResultSet rs = null;
        DatabaseMetaData dbMetadata = conn.getMetaData();
        try {
          if (DatabaseType.MYSQL.matches(dbMetadata.getDatabaseProductName())) {
            ps = conn.prepareStatement(sql, java.sql.ResultSet.TYPE_FORWARD_ONLY,
              java.sql.ResultSet.CONCUR_READ_ONLY);
            ps.setFetchSize(Integer.MIN_VALUE);
          } else {
            ps = conn.prepareStatement(sql);
          }
    
          rs = ps.executeQuery();
    
          ResultSetMetaData md = rs.getMetaData();
          int cols = md.getColumnCount();
          if (includeHeader) {
            for (int i = 0; i < cols; i++) {
              s.append("\"").append(md.getColumnName(i + 1)).append("\"");
              if (i + 1 < cols) {
                s.append(delimiter);
              } else {
                s.append(end_of_line);
              }
            }
          }
          while (rs.next()) {
            for (int i = 0; i < cols; i++) {
              s.append(getFormattedValue(rs, i + 1, dateFormat, dateTimeFormat, timeFormat));
              if (i + 1 < cols) {
                s.append(delimiter);
              } else {
                s.append(end_of_line);
              }
            }
          }
        } finally {
          try {
            if (rs != null) {
              rs.close();
            }
          } catch (Exception e) {
          }
          try {
            if (ps != null) {
              ps.close();
            }
          } catch (Exception e) {
          }
        }
        return s;
      }
    
      public static Long exportBufferedDataToFile(Connection conn, String sql, String delimiter, boolean includeHeader,
        String end_of_line, ContentService cs, Long documentToOverwrite, Long folder, String name, String extension,
        SimpleDateFormat dateFormat, SimpleDateFormat dateTimeFormat, SimpleDateFormat timeFormat)
        throws SQLException, ParseException, IOException, Exception {
    
        PreparedStatement ps = null;
        CallableStatement calls = null;
        ResultSet rs = null;
        String filePath = null;
        Document d = null;
        BufferedWriter bw = null;
        try {
          // check to see if Oracle Stored Proc since
          // we MUST register an output for result sets to be returned
          // support single OUT parameter as a cursor
          DatabaseMetaData dmd = conn.getMetaData();
          if (DatabaseType.ORACLE.matches(dmd.getDatabaseProductName()) &&
            (sql.toLowerCase().startsWith("call ") || sql.toLowerCase().startsWith("{call "))) {
            calls = conn.prepareCall(sql);
            // register the sysref OUT cursor
            calls.registerOutParameter(1, TYPE_ORACLE_CURSOR);
            calls.executeUpdate();
            rs = (ResultSet) calls.getObject(1);
          } else {
            if (DatabaseType.MYSQL.matches(dmd.getDatabaseProductName())) {
              ps = conn.prepareStatement(sql, java.sql.ResultSet.TYPE_FORWARD_ONLY,
                java.sql.ResultSet.CONCUR_READ_ONLY);
              ps.setFetchSize(Integer.MIN_VALUE);
            } else {
              ps = conn.prepareStatement(sql);
            }
            rs = ps.executeQuery();
          }
    
          if (!rs.isBeforeFirst()) {
            // no data so don't return anything
            return null;
          }
    
          // prepare file
          d = registerDocument(cs, documentToOverwrite, folder, name, extension);
          filePath = d.getInternalFilename();
          // create the missing folder structure if it does not exist
          File newFile = new File(filePath);
          if (!newFile.getParentFile().exists()) {
            newFile.getParentFile().mkdirs();
          }
          // set up writer
          bw = Files.newBufferedWriter(Paths.get(filePath), StandardCharsets.UTF_8);
          // start writing data
          ResultSetMetaData md = rs.getMetaData();
          int cols = md.getColumnCount();
          if (includeHeader) {
            for (int i = 0; i < cols; i++) {
              writeTextToFile(bw, "\"" + md.getColumnName(i + 1) + "\"");
              if (i + 1 < cols) {
                writeTextToFile(bw, delimiter);
              } else {
                writeTextToFile(bw, end_of_line);
              }
            }
          }
          while (rs.next()) {
            for (int i = 0; i < cols; i++) {
              writeTextToFile(bw, getFormattedValue(rs, i + 1, dateFormat, dateTimeFormat, timeFormat));
              if (i + 1 < cols) {
                writeTextToFile(bw, delimiter);
              } else {
                writeTextToFile(bw, end_of_line);
              }
            }
          }
        } catch (Exception e) {
          // generic exception catch all - use this block to clean up
          try {
            if (d != null) {
              // if document object has been created, try to delete it
              cs.deleteVersion(d.getId(), ContentConstants.VERSION_CURRENT, true);
            }
          } catch (Exception ex) {
          }
    
          throw new Exception(e);
        } finally {
          // close RS
          try {
            if (rs != null) {
              rs.close();
            }
          } catch (Exception e) {
          }
          try {
            if (ps != null) {
              ps.close();
            }
          } catch (Exception e) {
          }
    
          // close Writer
          try {
            if (bw != null) {
              closeBufferedWriter(bw);
            }
          } catch (Exception e) {
    
          }
        }
    
        // finalize document attributes
        File file = new File(filePath);
        d.setSize((int) file.length());
        cs.setSizeOfDocumentVersion(d.getId());
        return d.getId();
      }
    
      private static Document registerDocument(ContentService cs, Long document, Long folder, String name,
        String extension) throws Exception {
    
        // boolean isDebug = LOG.isDebugEnabled();
        ContentFilter cf = new ContentFilter(ContentConstants.TYPE_DOCUMENT);
        cf.setName(name);
        cf.setExtension(new String[] { extension });
    
        Long docId = null;
        if (document != null) {
          Document d = (Document) cs.getVersion(document, ContentConstants.VERSION_CURRENT);
          d.setFileSystemId(ContentConstants.ALLOCATE_FSID);
          Approval a = cs.createVersion(d, ContentConstants.UNIQUE_FOR_PARENT);
          docId = a.getId()[0];
        } else {
          ResultPage children = cs.getChildrenPaging(folder, cf, ContentConstants.GC_MOD_NORMAL, 0, 1, ContentConstants.COLUMN_NONE,
            Constants.SORT_ORDER_ASCENDING);
          if (children == null || children.getAvailableItems() == 0) {
            Document d = new Document();
            d.setName(name);
            d.setExtension(extension);
            d.setSize(1);
            d.setParent(folder);
            d.setState(ContentConstants.STATE_ACTIVE_PUBLISHED);
            docId = cs.create(d, ContentConstants.UNIQUE_FOR_PARENT);
          } else {
            Document d = (Document) children.getResults()[0];
            d.setFileSystemId(ContentConstants.ALLOCATE_FSID);
            Approval a = cs.createVersion(d, ContentConstants.UNIQUE_FOR_PARENT);
            docId = a.getId()[0];
          }
        }
    
        if (docId != null) {
          Document d = cs.download(docId, ContentConstants.VERSION_CURRENT, false)[0];
          d.setId(docId);
          return d;
        }
        return null;
      }
    
      public static void writeTextToFile(BufferedWriter bw, String text) throws IOException {
        bw.write(text);
      }
    
      public static void closeBufferedWriter(BufferedWriter bw) throws IOException {
        bw.flush();
        bw.close();
      }
    
      public static void addBatchInsert(PreparedStatement ps_, String[] values_, int[] columnTypes_,
        boolean[] columnEncryptedFlags_, SimpleDateFormat dateFormat, DateTimeFormatter dateTimeFormat,
        SimpleDateFormat timeFormat, EncryptionService es)
        throws SQLException, ParseException, DateTimeParseException {
    
        for (int i = 0; i < values_.length; i++) {
          if (columnEncryptedFlags_[i]) {
            values_[i] = es.decrypt(new TypedValue((long) AppianType.ENCRYPTED_TEXT, values_[i]));
          }
          switch (columnTypes_[i]) {
          case Types.DATE:
            if (StringUtils.isEmpty(values_[i])) {
              ps_.setNull(i + 1, columnTypes_[i]);
            } else {
              ps_.setDate(i + 1, new java.sql.Date(dateFormat.parse(values_[i]).getTime()));
            }
            break;
          case Types.TIMESTAMP:
            if (StringUtils.isEmpty(values_[i])) {
              ps_.setNull(i + 1, columnTypes_[i]);
            } else {
              ps_.setTimestamp(i + 1, parseTimestamp(dateTimeFormat, values_[i]));
            }
            break;
          case Types.TIME:
            if (StringUtils.isEmpty(values_[i])) {
              ps_.setNull(i + 1, columnTypes_[i]);
            } else {
              ps_.setTime(i + 1, new java.sql.Time(timeFormat.parse(values_[i]).getTime()));
            }
            break;
          case Types.DOUBLE:
          case Types.FLOAT:
          case Types.DECIMAL:
          case Types.NUMERIC:
          case Types.REAL:
            if (StringUtils.isEmpty(values_[i])) {
              ps_.setNull(i + 1, columnTypes_[i]);
            } else {
              ps_.setDouble(i + 1, Double.parseDouble(values_[i]));
            }
            break;
          case Types.BIT:
          case Types.BOOLEAN:
            if (StringUtils.isEmpty(values_[i])) {
              ps_.setNull(i + 1, columnTypes_[i]);
            } else if (values_[i].equals("0") || values_[i].equals("1")) {
              ps_.setBoolean(i + 1, values_[i].equals("1"));
            } else {
              ps_.setBoolean(i + 1, Boolean.parseBoolean(values_[i]));
            }
            break;
          case Types.INTEGER:
          case Types.BIGINT:
          case Types.SMALLINT:
          case Types.TINYINT:
            if (StringUtils.isEmpty(values_[i])) {
              ps_.setNull(i + 1, columnTypes_[i]);
            } else {
              ps_.setInt(i + 1, Integer.parseInt(values_[i]));
            }
            break;
          case Types.VARCHAR:
          case Types.CHAR:
          case Types.LONGVARCHAR:
            if (values_[i] == null) {
              ps_.setNull(i + 1, columnTypes_[i]);
            } else if (StringUtils.isEmpty(values_[i])) {
              ps_.setString(i + 1, "");
            } else {
              ps_.setString(i + 1, values_[i]);
            }
            break;
          case Types.NVARCHAR:
          case Types.NCHAR:
          case Types.LONGNVARCHAR:
            if (values_[i] == null) {
              ps_.setNull(i + 1, columnTypes_[i]);
            } else if (StringUtils.isEmpty(values_[i])) {
              ps_.setNString(i + 1, "");
            } else {
              ps_.setNString(i + 1, values_[i]);
            }
            break;
          default:
            LOG.error("Error no handler for " + columnTypes_[i]);
          }
        }
        ps_.addBatch();
      }
    
      public static String getInsertSQL(DatabaseType dbType, String tableName, String[] columns) {
        String columnsAsList = dbType.startQuoteSymbol + StringUtils.join(columns, dbType.endQuoteSymbol + ", " + dbType.startQuoteSymbol) +
          dbType.endQuoteSymbol;
    
        StringBuffer insertSql = new StringBuffer("insert into ");
        appendTableNameWithQuotes(dbType, insertSql, tableName);
    
        insertSql.append(" (").append(columnsAsList).append(")");
        insertSql.append(" values (");
        for (int i = 0; i < columns.length; i++) {
          insertSql.append("?");
          if (i < columns.length - 1) {
            insertSql.append(", ");
          }
        }
        insertSql.append(")");
        return insertSql.toString();
      }
    
      public static String getInsertSQL(DatabaseType dbType, String tableName, String[] columns, String[] extraColumns) {
        String columnsAsList = dbType.startQuoteSymbol + StringUtils.join(columns, dbType.endQuoteSymbol + ", " + dbType.startQuoteSymbol) +
          dbType.endQuoteSymbol;
        String extraColumnsAsList = "";
    
        if (extraColumns != null) {
          extraColumnsAsList = dbType.startQuoteSymbol +
            StringUtils.join(extraColumns, dbType.endQuoteSymbol + ", " + dbType.startQuoteSymbol) + dbType.endQuoteSymbol;
        }
    
        StringBuffer insertSql = new StringBuffer("insert into ");
    
        appendTableNameWithQuotes(dbType, insertSql, tableName);
    
        insertSql.append(" (").append(columnsAsList);
    
        if (extraColumns != null) {
          insertSql.append(", ");
          insertSql.append(extraColumnsAsList);
        }
    
        insertSql.append(")");
        insertSql.append(" values (");
        for (int i = 0; i < columns.length; i++) {
          insertSql.append("?");
          if ((i < columns.length - 1) || (extraColumns != null)) {
            insertSql.append(", ");
          }
        }
    
        if (extraColumns != null) {
          for (int i = 0; i < extraColumns.length; i++) {
            insertSql.append("?");
            if (i < extraColumns.length - 1) {
              insertSql.append(", ");
            }
          }
        }
    
        insertSql.append(")");
        return insertSql.toString();
      }
    
      public static DatabaseType getDatabaseType(String dbName) {
        LOG.debug("Database name is: " + dbName);
        if (DatabaseType.POSTGRE.matches(dbName)) {
            return DatabaseType.POSTGRE;
        }
        else if (DatabaseType.ORACLE.matches(dbName)) {
          return DatabaseType.ORACLE;
        } else if (DatabaseType.MSSQL.matches(dbName)) {
          return DatabaseType.MSSQL;
        } else if (DatabaseType.DB2.matches(dbName)) {
          return DatabaseType.DB2;
        } else if (DatabaseType.MARIADB.matches(dbName)) {
          return DatabaseType.MARIADB;
        } else {
          return DatabaseType.MYSQL;
        }
      }
    
      public static void appendTableNameWithQuotes(DatabaseType dbType, StringBuffer sqlBuilder, String tableName) {
        String[] tableNameSplit = tableName.split("\\.");
        sqlBuilder.append(dbType.startQuoteSymbol);
        sqlBuilder.append(StringUtils.join(tableNameSplit, dbType.endQuoteSymbol + "." + dbType.startQuoteSymbol));
        sqlBuilder.append(dbType.endQuoteSymbol);
      }
    
      private static Timestamp parseTimestamp(DateTimeFormatter dateTimeFormat, String value) throws DateTimeParseException {
        LocalDateTime ldt = LocalDateTime.parse(value, dateTimeFormat);
        return Timestamp.valueOf(ldt);
      }
    }
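
    For reference, the SQL shape produced by getInsertSQL for a double-quote database can be sketched standalone. The snippet below re-implements the column/placeholder assembly with the Java standard library instead of commons-lang, and omits the schema-splitting that appendTableNameWithQuotes performs, so it is an illustration rather than the plugin's exact code; table and column names are made up:

    ```java
    import java.util.Collections;

    public class InsertSqlSketch {
      // Illustrative re-implementation of the column list and "?" placeholder
      // assembly done by getInsertSQL, using String.join in place of
      // commons-lang StringUtils.join.
      static String insertSql(String table, String[] cols, String q) {
        String colList = q + String.join(q + ", " + q, cols) + q;
        String placeholders = String.join(", ", Collections.nCopies(cols.length, "?"));
        return "insert into " + q + table + q + " (" + colList + ") values (" + placeholders + ")";
      }

      public static void main(String[] args) {
        // With PostgreSQL-style double quotes:
        // insert into "employee" ("id", "name") values (?, ?)
        System.out.println(insertSql("employee", new String[] { "id", "name" }, "\""));
      }
    }
    ```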
    

  • Hi - we are running into a show-stopper situation: one of the smart services offered as part of this plugin, "Import CSV to Database", is not working for a PostgreSQL database (one of the databases supported by/compatible with the Appian platform).

    We are getting the below errors:

    • Error in v5 plugin: syntax error at or near "`" Position: 15 
    • Error in v6 plugin: java.lang.reflect.UndeclaredThrowableException

    Since this is quite critical, can someone please confirm whether anyone has made this work with PostgreSQL?
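
    For anyone hitting this before a fix lands: the v5 error suggests the plugin fell back to MySQL-style backtick quoting, which PostgreSQL rejects because it only accepts ANSI double-quoted identifiers. A minimal illustration of the difference (the table name is made up):

    ```java
    public class QuoteStyleDemo {
      // Build a trivial SELECT using a database's identifier quote symbols.
      static String selectAll(String table, String startQuote, String endQuote) {
        return "SELECT * FROM " + startQuote + table + endQuote;
      }

      public static void main(String[] args) {
        // MySQL/MariaDB accept backticks around identifiers:
        System.out.println(selectAll("my_table", "`", "`"));
        // PostgreSQL only accepts ANSI double quotes, so the backticked
        // statement above is what triggers: syntax error at or near "`"
        System.out.println(selectAll("my_table", "\"", "\""));
      }
    }
    ```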

  • Hi,

    I was able to use the queryappianlogs function with the older version of Excel Tools, but after updating to the latest plugin I get an error message and no data is shown. Can you please fix this in the latest version?

    Please find below the error log in the application server log.

    ERROR: Bundle com.appiancorp.ps.exceltools [61] Unable to get module class path. (java.lang.NullPointerException)
    java.lang.NullPointerException
    	at org.apache.felix.framework.BundleRevisionImpl.calculateContentPath(BundleRevisionImpl.java:432)
    	at org.apache.felix.framework.BundleRevisionImpl.initializeContentPath(BundleRevisionImpl.java:365)
    	at org.apache.felix.framework.BundleRevisionImpl.getContentPath(BundleRevisionImpl.java:351)
    	at org.apache.felix.framework.BundleRevisionImpl.getResourceLocal(BundleRevisionImpl.java:494)
    	at org.apache.felix.framework.BundleWiringImpl.findClassOrResourceByDelegation(BundleWiringImpl.java:1617)
    	at org.apache.felix.framework.BundleWiringImpl.getResourceByDelegation(BundleWiringImpl.java:1460)
    	at org.apache.felix.framework.BundleWiringImpl$BundleClassLoader.getResource(BundleWiringImpl.java:2597)
    	at com.ibm.db2.jcc.am.wd.run(wd.java:49)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at com.ibm.db2.jcc.am.GlobalProperties.a(GlobalProperties.java:146)
    	at com.ibm.db2.jcc.am.GlobalProperties.d(GlobalProperties.java:100)
    	at com.ibm.db2.jcc.am.dr.run(dr.java:124)
    	at java.util.TimerThread.mainLoop(Timer.java:555)
    	at java.util.TimerThread.run(Timer.java:505)
    java.lang.NullPointerException
    
  • We are using 2.2.11 and the "Export SQL data to CSV" smart service is missing.

    Can that be added?

  • Hello,

    Does this plugin no longer include the Export CDT to Excel smart service? If it is no longer supported, has anything replaced it? I have a use case that requires exporting a CDT, datasubset, map, or some other type that is not from the DSE to Excel.

    Thank you,
    Walker

  • I have a similar use case as well. Addressing this would be very helpful!

  • Hello,

    We would like to request an enhancement to the readexcelsheet function.

    Current behavior of this function:

    • If an entire row is empty, it is not read/brought into Appian
    • There is no limit on how many rows are read

    What is needed:
    Could we introduce two new parameters?

    • Allow nulls to be included
      Reason: When we run validation checks on the rows, we need to report which row has the validation error. Because this function skips empty rows in the middle of the sheet, the row numbering in the output is wrong. Including empty rows would keep the row count accurate.
    • Max rows parameter
      Since null values would now be included, we should add a restriction on how many rows are read, with a sensible default maximum. 50,000 rows max?
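
    A rough sketch of how the two proposed parameters could behave, in plain Java over a hypothetical list of already-parsed rows (where null stands for an empty Excel row); the names and shape are illustrative, not the plugin's actual API:

    ```java
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    public class ReadRowsSketch {
      // Hypothetical post-processing illustrating the two requested parameters:
      // includeNulls keeps empty rows as null placeholders (preserving row
      // numbering for validation messages), and maxRows caps how many rows
      // are returned.
      static List<String[]> readRows(List<String[]> sheetRows, boolean includeNulls, int maxRows) {
        List<String[]> out = new ArrayList<>();
        for (String[] row : sheetRows) {
          if (out.size() >= maxRows) {
            break; // stop once the cap is reached
          }
          boolean empty = row == null ||
            Arrays.stream(row).allMatch(c -> c == null || c.isEmpty());
          if (empty && !includeNulls) {
            continue; // current behavior: empty rows are skipped entirely
          }
          out.add(empty ? null : row);
        }
        return out;
      }
    }
    ```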
  • v2.2.11 Release Notes
    • Fixed backwards compatibility issues with 2.2.10
  • Hello,

    Does the current version of this plugin allow exporting a datasubset or CDT to Excel, and downloading that Excel file from the UI? If so, which functions would be used? I also noticed there are currently two versions of this plugin accessible through the cloud.

    Thank you,

    Walker