System out of memory error when importing in automation script

Hi,

I am having a problem importing a 129 MB txt file into my Oracle table when the import runs inside an Automation script. When I test the import on the full file by itself, it works fine. But when I use it in an Automation script that first exports the file from a query against SQL Server and then tries to import that file into Oracle, I get this error. Any help would be appreciated. I can't tell whether there is some sort of lock on the file when the import runs, or whether running the export before the import really is making the desktop run out of memory.

One other note: the error file says the file has no data, but it clearly does when I open it.

Thanks in advance,

Greg

Your file has 76 columns, and the default buffer size of 500 is too large for it. To resolve this, change all of the blockInsertValue values to 100 in ModuleSettings\ImportExportSettings.xml. This file is located in the Application Data Directory, which is accessible from the About box.
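For reference, the relevant entries look something like this. This is only a sketch: the surrounding element names are illustrative and may differ in your version of the file; the point is to lower every blockInsertValue from the default 500 to 100.

```xml
<!-- ImportExportSettings.xml (in the Application Data Directory, via the About box) -->
<!-- Illustrative structure only; locate each blockInsertValue entry and reduce it. -->
<ImportExportSettings>
  <ImportSettings>
    <!-- Rows buffered per block insert; 500 * 76 columns exhausted memory -->
    <blockInsertValue>100</blockInsertValue>
  </ImportSettings>
</ImportExportSettings>
```

Make a backup copy of the XML file before editing, and restart the application afterward so the new setting is picked up.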

Debbie