Loading huge data into an Oracle environment using TOAD for Data Analyst

Hi all, I have a text file of over 2 GB that I need to load into an Oracle environment. I am currently using Toad for Data Analyst to do this; however, it takes more than 10 hours, and I ran into some issues along the way:

  1. The original file has a row count of 2.4 million, but I only get counts of a few hundred thousand after importing it into the Oracle environment, even though the import results claimed the same number of records as the original text file.
  • Sometimes there’s a popup window with options to Roll Back, or Keep Open.
  2. One of the text files has more records, so the import wizard threw back an error:
    Parsing Error! Please try other delimeter quote or file encoding
    Arithmetic operation resulted in an overflow
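For what it's worth, the source file's row count can be verified independently of the import tool. A minimal sketch in Python, assuming a plain newline-delimited file (the file path is a placeholder, not from the original post):

```python
def count_rows(path):
    """Count newline-terminated rows in a large file without loading it into memory."""
    count = 0
    with open(path, "rb") as f:
        # Read in 1 MiB chunks and count newline bytes.
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            count += chunk.count(b"\n")
    return count

# Compare count_rows("bigfile.txt") against SELECT COUNT(*) on the target table.
```

If the two numbers already disagree before Oracle is involved, the problem is in the file; if they agree, the rows are being dropped during the import.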

Can someone please guide me on how to import huge data into an Oracle environment using TOAD for Data Analyst, or suggest which TOAD product I should be using to do this better?

Thanks in advance

  1. RE: sometimes there’s a popup window with options to Roll Back, or Keep Open.
    You need to read that message carefully. On the first error it encounters, it will pop up and tell you what the problem is. You can stop and roll back at this point, or continue and ignore all of the rest of the issues. So there can be many rows that error, but you will only get one popup. I would first focus on correcting that error.

You can, however, choose to import and put all error rows into a separate file. The last page of Import says “If error occurs, save unimported rows to discard file”. Choose that option, and then you can deal with your error rows separately.

  1. The parsing error is in regards to understanding the file. There are many options, and the preview window lets you see how TDA is parsing the file. You may need to change these options, as different applications export data differently (delimiters, embedded quotes, etc.).
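If it helps, you can check the delimiter and quote guess outside the wizard too. A quick sketch in Python using the standard library's `csv.Sniffer` (the file path is a placeholder, and Sniffer is heuristic, so treat its answer as a starting point for the wizard settings rather than the final word):

```python
import csv

def sniff_format(path, sample_bytes=64 * 1024, encoding="utf-8", delimiters=None):
    """Guess the delimiter and quote character from the start of the file."""
    with open(path, newline="", encoding=encoding) as f:
        sample = f.read(sample_bytes)
    # Restricting the candidates via `delimiters` makes the guess more reliable.
    dialect = csv.Sniffer().sniff(sample, delimiters=delimiters)
    return dialect.delimiter, dialect.quotechar
```

Trying a different encoding (e.g. "utf-16" or "cp1252") in the same call is also a cheap way to reproduce the wizard's "file encoding" complaint.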

  2. There is also a different technique you might consider. Here you don’t use an export file; you connect directly to the import source and execute a query, then define the import to the target table. This is a much more manageable way to handle large amounts of data.

These techniques are discussed in this video. I would take a few minutes and watch it.


If you still have issues, post a screenshot of the error and preview window and/or send a 2,000-row sample of your data (or at least include some of the rows that error in item 1).


Hi Debbie,

Thanks for your reply!

I have watched the video that you posted.

For me, the popup window in 1) says that the import was successful with a count of 2.4 million rows and no errors encountered. However, when I check my row counts in Oracle, it shows only around 200k rows instead of 2.4 million.

  • As I mentioned, during the import process a popup window would appear from time to time and disappear in 5-7 seconds. I can’t remember what it was about, but I think it has something to do with keeping the connection open or a timeout error. Is there a configuration that would allow me to keep the connection open indefinitely?
  1. For the parsing error, I tried splitting the 2.4 million records in half, and the import process seemed to work fine, except it ran into issue 1) again, where not all rows were actually imported.

  2. I only have the text file as the source of the data to be imported into Toad, so the other technique you mentioned in your video doesn’t seem applicable.
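For the splitting workaround in 1), a small script avoids doing it by hand. A sketch in Python (the path and the 500k chunk size are placeholders; it assumes the first line is a header that each piece should keep):

```python
def split_file(path, rows_per_chunk=500_000, encoding="utf-8"):
    """Split a large delimited file into pieces, repeating the header in each piece."""
    parts = []
    with open(path, encoding=encoding) as src:
        header = src.readline()
        part, out, written = 0, None, 0
        for line in src:
            if out is None:
                # Start a new piece and write the header first.
                part += 1
                out = open(f"{path}.part{part}", "w", encoding=encoding)
                out.write(header)
            out.write(line)
            written += 1
            if written >= rows_per_chunk:
                out.close()
                parts.append(f"{path}.part{part}")
                out, written = None, 0
        if out is not None:
            out.close()
            parts.append(f"{path}.part{part}")
    return parts
```

Summing the row counts of the pieces after each import makes it obvious which piece is silently losing rows.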

I will try to capture a screenshot of the error next time.


I love using TDA, but I must say there are things it is not the best suited tool for.

If you are looking to load such a huge amount of data fast, I would recommend using sqlldr (SQL*Loader).
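To sketch what that looks like: sqlldr reads a control file describing the source file and the target table. Everything below is a placeholder (file name, table, columns), the delimiter has to match your file, and direct path loading (`direct=true`) is what makes it fast for millions of rows:

```
-- load.ctl (placeholder names throughout)
LOAD DATA
INFILE 'bigfile.txt'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(col1, col2, col3)
```

Then run it from the command line, e.g. `sqlldr userid=scott/tiger@mydb control=load.ctl log=load.log bad=load.bad`. Rows that fail to load land in the `.bad` file, much like TDA's discard file, so nothing disappears silently.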


Alexandre DOLIQUE.

This is a product I would like to have from Toad (ETL). I use SQL Server’s SSIS to perform simple ETL for tasks like this. SSIS is platform agnostic but requires a SQL Server installation (we only use Teradata). On the other hand, I don’t really need something as robust as, say, Informatica. The most I typically do is move data from one database to another (a lot of E&L; not so much T). This might be useful if E&L are a part of TDP.

In a pinch, you can use MS Access for minor ETL (about the only thing it’s good for).