Importing a file with about 45,000 records and 40 columns into a table takes about 7 hours to complete. I am using Toad Data Point Version 4.2.1.303 (64-bit). Has anyone else experienced imports being this slow?
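(For scale: 45,000 rows in 7 hours is 45,000 / 25,200 s ≈ 1.8 rows per second, which would be consistent with the rows being inserted one at a time.)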
What type of database are you importing into? We have bulk processing for imports into Oracle, DB2, SQL Server, and MySQL. Bulk processing is used only if you use the native connectors and do not enable the option to collect errors.
I am importing data into a Teradata table.
We don't have bulk processing for that provider. The only thing I can suggest is to uncheck the Optimize block size option and increase the row count. This may help.
I am also having this issue, but I am importing into a SQL Server database. The automation script log shows that it is importing one record at a time. It is a SQL Server connection, not an ODBC connection. I have a support case open and am waiting to hear back, but any suggestions are welcome.
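For context, the difference between the per-row pattern the log shows and a batched insert is easy to reproduce outside of Toad. Here is a minimal pyodbc sketch; the connection string, table, and column names are all hypothetical:

```python
# Minimal sketch (not Toad's internal code) contrasting row-by-row inserts
# with batched inserts against SQL Server via pyodbc. Server, database,
# table, and column names are hypothetical stand-ins.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
cur = conn.cursor()

rows = [(i, f"value_{i}") for i in range(45_000)]  # stand-in for the CSV rows

# Slow pattern: one statement (and one round trip) per row, which is what
# the automation log suggests is happening.
# for row in rows:
#     cur.execute("INSERT INTO dbo.import_target (id, val) VALUES (?, ?)", row)

# Faster pattern: let the driver send the rows in bulk batches.
cur.fast_executemany = True
cur.executemany("INSERT INTO dbo.import_target (id, val) VALUES (?, ?)", rows)
conn.commit()
```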
What is the support case number? I’d like to look at it.
Hi Debbie,
The SR Number is 4368827.
Thanks,
Todd C.
I asked them to open a subcase to dev to try to replicate it.
Hi Debbie,
Thanks for the assistance. Is there some setting in the Toad options that could be disabling the import of multiple records at a time, as is done with a bcp command-line import or a BULK INSERT statement (rough sketch after this post)?
Regards,
Todd C.
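As a point of reference, here is what that server-side bulk path looks like when driven directly, a sketch issuing a T-SQL BULK INSERT through pyodbc. The server, table, file path, and options are all hypothetical, and note that BULK INSERT reads the file from the database server's own filesystem:

```python
# Hedged illustration of the bulk path: executing a T-SQL BULK INSERT via
# pyodbc. Connection string, table, path, and options are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;",
    autocommit=True,
)
conn.execute(
    """
    BULK INSERT dbo.import_target
    FROM 'C:\\imports\\data.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\\n',
        FIRSTROW = 2,       -- skip the header row
        BATCHSIZE = 10000,
        TABLOCK
    );
    """
)
```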
This is entered as QAT-13697 for a fix but will not make it into the TDP 5.0 release. The issue is caused by the CSV file being located on the network. Move the CSV file locally before importing as a short-term solution.
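For automation jobs, that workaround can be scripted as a pre-import step. A minimal sketch, with all paths hypothetical:

```python
# Sketch of the suggested workaround: copy the CSV from the network share
# to a local folder, then point the import at the local copy. All paths
# are hypothetical.
import shutil
from pathlib import Path

network_csv = Path(r"\\fileserver\share\data.csv")  # hypothetical UNC path
local_csv = Path(r"C:\temp") / network_csv.name

local_csv.parent.mkdir(parents=True, exist_ok=True)
shutil.copy2(network_csv, local_csv)
print(f"Import from {local_csv} instead of {network_csv}")
```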
Debbie,
I have updated the case, as it is still only inserting one record at a time even after using local files.
Regards,
T.
When you moved the CSV file locally, did you turn the Optimize block size option back on? I will have our QA look into this again, but I did want to confirm that you also made that change.