I have a CSV file in which some records have up to 4000 characters in a particular field. The table I am inserting into has those fields defined as VARCHAR2(4000). I have tried both 4000 BYTE and 4000 CHAR semantics.
Using the Import Table Data wizard, all records load 100% successfully with no errors reported, but when I look at the table, the maximum number of characters loaded in the records that had up to 4000 is 2001. The fields were truncated with no warnings or errors.
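For reference, this is how I checked the loaded lengths after the import (table and column names here are placeholders for my actual ones):

```sql
-- Compare character length vs. byte length of the imported field;
-- MAX(LENGTH(...)) comes back as 2001 instead of the expected 4000.
SELECT MAX(LENGTH(wide_field))  AS max_chars,
       MAX(LENGTHB(wide_field)) AS max_bytes
FROM   my_import_table;
```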
The only setting in the Import Data wizard that seems relevant (even though the field size is set to 4000 for the table) is 'If data is too large for a string field', which is set to Prompt/Abort.
To test further, I created a SQL*Loader script for the same CSV file, using FieldName CHAR(4000) in the CTL file. That worked fine and loaded all the characters. If Toad has an equivalent setting for loading long strings that affects Import Table Data, I haven't found it.
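The control file I used looked roughly like this (file, table, and column names are placeholders; the key part is the explicit CHAR(4000), since SQL*Loader's default maximum field length for character data is only 255):

```sql
LOAD DATA
INFILE 'data.csv'
INTO TABLE my_import_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  wide_field CHAR(4000)  -- raise the default 255-byte limit to the full column width
)
```

With this CTL file, all 4000 characters arrive intact, so the truncation appears to be specific to the Import Table Data wizard rather than the database column.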
Has anyone experienced this scenario before, or have any suggestions as to what may be causing it?
The DB is Oracle 10gR2. Toad for Oracle 18.104.22.168.