Import huge CSV to Database

Hello, I have used the smart service "Import CSV to Database v5" to import a large CSV file (250,000 rows, 177 MB) into a table. The problem is that only part of the rows (about 210,000) gets imported. Each time I debug the process model, the node finishes two minutes after it starts, so it looks like the node is timing out; if it ran a little longer, all the rows would be written. I have searched for a way to write the rows into the table in smaller batches to work around the timeout, but I have only found a way to do that when the document is an Excel file (using the readexcelsheet function). Does anyone know how to import this huge CSV? Thank you for your time.
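
One workaround, outside the smart service itself, is to pre-split the file into smaller CSVs and run "Import CSV to Database" once per chunk, so each execution finishes well inside the timeout. Below is a minimal Python sketch of that idea; the file names and chunk size are assumptions, so adjust them to the real export.

```python
import csv

# Hypothetical file name and chunk size; adjust to the real export.
SOURCE = "huge_export.csv"
CHUNK_ROWS = 50_000  # small enough that each import finishes before the node times out


def write_chunk(header, rows, index):
    """Write one self-contained CSV chunk, e.g. huge_export_part1.csv."""
    with open(f"huge_export_part{index}.csv", "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(header)  # repeat the header so each chunk imports on its own
        writer.writerows(rows)


with open(SOURCE, newline="", encoding="utf-8") as src:
    reader = csv.reader(src)
    header = next(reader)
    chunk, part = [], 0
    for row in reader:
        chunk.append(row)
        if len(chunk) == CHUNK_ROWS:
            part += 1
            write_chunk(header, chunk, part)
            chunk = []
    if chunk:  # final, partially filled chunk
        part += 1
        write_chunk(header, chunk, part)
```

Each resulting file can then be fed to the smart service in a loop (for example from a multiple-node-instance subprocess), keeping every single run short.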

Reply
  • If this only needs to happen once, I would do it manually using existing RDBMS tools.

    If this is something that needs to happen on a regular basis, I would argue that moving that much data around via CSV is not the appropriate technical solution. An API call or a direct database connection, basically anything other than moving files around, would be a better fit; a bulk-load sketch over a database connection is shown below.
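
As an illustration of the database-connection route, here is a minimal Python sketch that bulk-loads the file with the database's own COPY command. It assumes a PostgreSQL target and the psycopg2 driver; the connection details, table name, and column names are made up.

```python
import psycopg2  # assumes a PostgreSQL target; other databases have comparable bulk-load commands

# Hypothetical connection details, table and column names.
conn = psycopg2.connect(host="db.example.com", dbname="appdb",
                        user="loader", password="secret")
with conn, conn.cursor() as cur, open("huge_export.csv", encoding="utf-8") as src:
    # COPY streams the whole file into the table in one pass,
    # so the 250,000 rows never pass through a per-row smart service call.
    cur.copy_expert(
        "COPY target_table (col_a, col_b, col_c) FROM STDIN WITH (FORMAT csv, HEADER true)",
        src,
    )
conn.close()
```

COPY is used here instead of row-by-row INSERTs because it is the server's own bulk-load path, and it runs entirely outside the Appian node that appears to be timing out.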
