Strategy for handling Oracle tables with NUMBER PK values > 2.2 billion

Hello All:

We are using Oracle and struggling to future-proof our application to handle a large number of rows and large PK values for certain tables/views.

The sequence used to insert values apparently only works with number data types, which puts the limit at 2^31-1 (about 2.1 billion, the maximum of Appian's 32-bit Integer type) for values that Appian can handle.

Casting the PK to VARCHAR in the database and mapping it to Text in Appian causes inserts to fail for the CDT/Data Store.

Is there any strategy we can use to allow PK numbers > 2.2 billion?

OriginalPostID-273724
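
For what it's worth, one strategy in this situation is to surface the NUMBER PK to Appian as Text (xsd:string) rather than Integer, since Text is not bound by the 32-bit limit; the reply below takes this approach. A minimal sketch of what such a CDT field mapping could look like in the XSD, assuming Appian's appian.jpa appinfo annotation convention (the element and column names are illustrative, not taken from this thread):

    <!-- Illustrative CDT field: a NUMBER primary key surfaced as Text in Appian -->
    <xsd:element name="bigId" type="xsd:string" nillable="true">
      <xsd:annotation>
        <xsd:appinfo source="appian.jpa">@Id @Column(name="BIG_ID")</xsd:appinfo>
      </xsd:annotation>
    </xsd:element>

Whether inserts then succeed is a separate question: if the CDT points at a view rather than the base table, the view still has to be updatable (or backed by an INSTEAD OF trigger) on the Oracle side.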

  • Can you give an example of how you did the XSD mapping for a full CDT? When I create a CDT with a single field (the PK of the view, which is the NUMBER data type mapped to the Appian Text type), the Data Store verifies successfully and the query works as you have stated. But as soon as I add another field to the CDT, I get this error for the previously working element: "The data source schema does not match the type mappings: Wrong column type in [] for column []. Found: number, expected: varchar2(255 char)." Thanks
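
The "Wrong column type" error quoted above comes from schema validation: the Data Store expects a varchar2(255 char) column behind a Text field, while the view column is actually NUMBER. One possible adjustment, sketched below, is to declare the underlying column definition explicitly in the XSD so that validation expects NUMBER instead; this assumes the columnDefinition attribute of @Column is honored by the Appian version in use, and the names are again illustrative rather than taken from this thread:

    <!-- Illustrative: declare the backing column as NUMBER so validation does not
         expect varchar2(255 char), while the field remains Text in Appian -->
    <xsd:element name="bigId" type="xsd:string" nillable="true">
      <xsd:annotation>
        <xsd:appinfo source="appian.jpa">@Id @Column(name="BIG_ID", columnDefinition="NUMBER")</xsd:appinfo>
      </xsd:annotation>
    </xsd:element>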