Toad's Import Table Data silently truncating characters (2001 limit)?

I have a CSV file in which some records have up to 4000 characters in a particular field. The table I am inserting into has that column defined as VARCHAR2(4000). I have tried it as both 4000 BYTE and 4000 CHAR.

Using the Import Table Data wizard, all records load 100% successfully with no errors reported, but when I look at the table, the records that had up to 4000 characters contain at most 2001. The fields were truncated with no warnings or errors.
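I confirmed the truncation with a query along these lines (MY_TABLE and MY_COLUMN stand in for the real names):

```sql
-- Check the longest value actually stored after the import.
-- MY_TABLE and MY_COLUMN are placeholders for the real names.
-- LENGTH counts characters, LENGTHB counts bytes, which matters
-- with a multi-byte character set such as AL32UTF8.
SELECT MAX(LENGTH(my_column))  AS max_chars,
       MAX(LENGTHB(my_column)) AS max_bytes
FROM   my_table;
```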

The only setting in the Import Data wizard that seems relevant (even though the column size is set to 4000 in the table) is ‘If data is too large for a string field’, which is set to Prompt/Abort.

To test further, I created a SQL*Loader script for the same CSV file, specifying FieldName CHAR(4000) in the CTL file. That worked fine and loaded all the characters. If there is some equivalent setting in Toad for loading long strings that affects Import Table Data, I haven’t found it.
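For reference, the control file I used looks roughly like this (file, table, and column names are simplified placeholders):

```sql
-- SQL*Loader control file (sketch; names are placeholders).
-- The CHAR(4000) is the important part: without an explicit length,
-- SQL*Loader treats delimited fields as CHAR(255) by default and
-- would reject the long values instead of loading them.
LOAD DATA
INFILE 'data.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  id,
  long_field CHAR(4000)
)
```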

Has anyone experienced this scenario before, or have any suggestions as to what may be causing it?

The DB is Oracle 10gR2. Toad for Oracle 12.1.0.22.

Thanks

Andrew

Hi

I think this has little to do with Toad and more to do with the database character set.
So, what is your database character set? You can read it in Toad under Database | Administer | NLS Settings, on the Database tab.

Besides that, please let us know your NLS session parameters as well (Session tab).
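If it is easier, you can read the same information straight from the data dictionary, for example:

```sql
-- Database character set
SELECT value
FROM   nls_database_parameters
WHERE  parameter = 'NLS_CHARACTERSET';

-- Current session NLS settings
SELECT parameter, value
FROM   nls_session_parameters;
```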

Brg,

Damir

Hi Damir,

I agree it’s unlikely to be Toad, but since it works with SQL*Loader and fails with Import Table Data in Toad, I’m open to suggestions.

I’ve tried on two different databases with the same result (different character sets, same database version). The character sets were US7ASCII and AL32UTF8.

These are the session parameters for the US7ASCII database (the session parameters are the same for the other database anyway):

[Attachment: Session Parameters.jpeg]

Thanks

Andrew

Hi Andrew,

Please send me by email (the address is in my profile) the create table script (with all indexes) as well as the file you are trying to import.

I assume it is a CSV text file, and I would try it in my database … maybe it is a bug …