Hi everyone,
I'm trying to load JSON into a database using the Appian smart service below. My sample JSON data is given further down. It has dynamic sub-parameters; for example, a user can have multiple types of phone number. If the JSON file is flat it loads without any issue, but the nested sub-parameters are not working as expected. More details below.
Main issue: accessing the JSON sub-parameters, which are dynamic.
In the Load JSON to a database smart service, in the "JSON key/Column Name mapping" field, I have tried JSON keys such as Phone.number as well as Phone[].number, but on execution the process model treats "FirstName" and "Phone" as the columns instead of the individual table columns FirstName, PhoneType, PhoneNumber. I am not able to navigate into the sub-parameters while mapping JSON keys to table column names.
Plugin name: JSON Tools
Smart service name: Load JSON to a database
My sample Appian rule:
{ "FirstName": "First_Name", "Phone[].number": """Phone_Number""",
}
and
{"FirstName": "First_Name","Phone.number": """Phone_Number""",
Sample JSON:
{ "FirstName": "Fred", "Phone": [ { "type": "home", "number": "0203 544 1234" }, { "type": "office", "number": "01962 001234" }, { "type": "office", "number": "01962 001235" }, { "type": "mobile", "number": "077 7700 1234" } ] }
While debugging, the JSON Tools plugin seems to consider FirstName and Phone as table names. I don't know why "Load JSON to a database" alone was written this way, because schema validation using "Validate a JSON against a schema" works correctly even with dynamic parameters. It is hard to debug since there is no proper documentation for the plugin, and even the images provided with the plugin show outdated parameters. Has anyone faced this issue, and is there any way to overcome it?
I would suggest implementing a process to transform the JSON into a record or another structure with the required format. After that, use 'Write Records' or 'Write to Data Store Entity' to persist the information.
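For example, a minimal expression sketch of that transformation, assuming the JSON arrives as text in local!json and the staging columns are FirstName, PhoneType and PhoneNumber (all of these names are placeholders, not anything from the plugin):
a!localVariables(
  /* parse the incoming JSON text into a dictionary */
  local!data: a!fromJson(local!json),
  /* build one flat row per phone entry, repeating the parent field */
  a!forEach(
    items: local!data.Phone,
    expression: {
      FirstName: local!data.FirstName,
      PhoneType: fv!item.type,
      PhoneNumber: fv!item.number
    }
  )
)
The resulting list can then be cast to your CDT and passed to Write to Data Store Entity, or mapped to a record type for Write Records.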
Yes, I thought the same, but in this case it is a kind of staging table, just to temporarily load a large number of columns, around 100+.
In this case, any approach is complicated due to the complexity of the data structure you have. I understand that the data needs to be inserted into multiple tables within an ER model. If that's the case, create a record structure in the Data Fabric, and then write to each record separately by extracting the relevant data for each one.
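A rough sketch of that split, again with placeholder names only, extracting the data for each table separately so each list can feed its own write:
a!localVariables(
  local!data: a!fromJson(local!json),
  /* row for the parent table */
  local!person: { FirstName: local!data.FirstName },
  /* rows for the child table, one per phone number */
  local!phones: a!forEach(
    items: local!data.Phone,
    expression: { Type: fv!item.type, Number: fv!item.number }
  ),
  /* each of these would then go to its own Write Records / Write to Data Store Entity */
  { person: local!person, phones: local!phones }
)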
Yeah, that is the approach I tried earlier; I just wanted to explore this tool.